916 results for Advent of christianity


Relevance: 90.00%

Abstract:

Innovation and internationalization in services are key drivers of structural transformation, productivity growth and overall economic performance in Latin America. The services sector accounts for two thirds of the region’s GDP and provides over 60% of its employment. These shares are higher than in other developing regions, but still lower than in countries with higher levels of per capita income. The spread of information and communication technologies in Latin America over the past three decades has vastly enhanced both the tradability of services and the sector’s propensity to innovate. Long considered unrelated processes, both internationalization and innovation are today widely recognized as key and complementary sources of firm-level competitiveness and human capital enhancement. The advent of many novel types of business and consumer services is furthermore a key factor in the growing integration of Latin American firms into regional and global value chains and transnational production networks, which are now the predominant form of organization of international production and trade. This volume explores three different levels of interaction between internationalization and innovation in the services sector in Latin America. Part I analyses the role of services in manufacturing and other sectors’ global value chains from a theoretical perspective, drawing on the experiences of Brazil and Mexico. Part II reviews innovation and internationalization policies and their effects on the performance of the services sector. Part III presents a series of case studies on innovation and internationalization linkages in Brazil, Chile, Costa Rica and Mexico. The book concludes that, in order for Latin American countries and firms to upgrade into services value chains, public and private initiatives must generate a host of regional public goods —enhanced investment climates, supply of skills, greater access to finance, improved protection of intellectual property, better value appropriation, enhanced efforts at standardization and quality certification— to strengthen the links between innovation and internationalization.

Relevance: 90.00%

Abstract:

Despite the advent of improved pharmacological treatments to alleviate substance-related desires, psychological approaches will continue to be required. However, the current psychological treatment that most specifically focuses on desires and their management, cue exposure (CE), has not lived up to its original promise. This paper argues that current psychological approaches to desire do not adequately incorporate our knowledge about the factors that trigger, maintain, and terminate episodes of desire. It asserts that the instigation and maintenance of desires involve both associative and elaborative processes. Understanding the processes triggering the initiation of intrusive thoughts may assist in preventing some episodes, but occasional intrusions will be inevitable. A demonstration of the ineffectiveness of thought suppression may discourage its use as a coping strategy for desire-related intrusions, and mindfulness meditation plus cognitive therapy may help in accepting their occurrence and letting them go. Competing tasks may be used to reduce elaboration of desires, and competing sensory images may have particular utility. The application of these procedures during episodes that are elicited in the clinic may allow the acquisition of more effective strategies to address desires in the natural environment.

Relevance: 90.00%

Abstract:

Various marker systems exist for genetic analysis of horticultural species. Isozymes were first applied to the woody perennial nut crop, macadamia, in the early 1990s. The advent of DNA markers saw the development, for macadamia, of STMS (sequence-tagged microsatellite site), RAPD (randomly amplified polymorphic DNA), and RAF (randomly amplified DNA fingerprinting). The RAF technique typically generates dominant markers, but within the dominant marker profiles, certain primers also amplify multi-allelic co-dominant markers that are suspected to be microsatellites. In this paper, we confirm this for one such marker, and describe how RAF primers can be chosen that amplify one or more putative microsatellites. This approach of genotyping anonymous microsatellite markers via RAF is designated RAMiFi (randomly amplified microsatellite fingerprinting). Several marker systems were compared for the type, amount, and cost-efficiency of the information generated, using data from published studies on macadamia. The markers were also compared for the way they clustered a common set of accessions. The RAMiFi approach was identified as the most efficient and economical. The availability of such a versatile tool offers many advantages for the genetic characterisation of horticultural species.

Relevance: 90.00%

Abstract:

The therapeutic letter has a long history, with roots in psychoanalytic work and continuing application in family therapy. The advent of e-mail has allowed another form of therapeutic written communication which, while incorporating the benefits of therapeutic letters, adds to them. It has also opened up some potential risks. This article incorporates a brief review of the literature covering therapeutic written communication and offers a case example where e-mail was used as an adjunct to face-to-face therapy with a client who experienced attachment difficulties. This therapy was informed by systemic and psychoanalytic traditions. The authors explore a variety of technical matters including the timing and crafting of e-mail responses, the integration of written communication with face-to-face therapy, the impact on the therapeutic relationship, and the management of crises. Ethical issues such as confidentiality and duty of care are also considered.

Relevance: 90.00%

Abstract:

The advent of molecular markers as a tool to aid selection has provided plant breeders with the opportunity to rapidly deliver superior genetic solutions to problems in agricultural production systems. However, a major constraint to the implementation of marker-assisted selection (MAS) in pragmatic breeding programs in the past has been the perceived high relative cost of MAS compared to conventional phenotypic selection. In this paper, computer simulation was used to design a genetically effective and economically efficient marker-assisted breeding strategy aimed at a specific outcome. Under investigation was a strategy involving the integration of both restricted backcrossing and doubled haploid (DH) technology. The point at which molecular markers are applied in a selection strategy can be critical to the effectiveness and cost efficiency of that strategy. The application of molecular markers was considered at three phases in the strategy: allele enrichment in the BC1F1 population, gene selection at the haploid stage, and selection for recurrent parent background of DHs prior to field testing. Overall, incorporating MAS at all three stages was the most effective, in terms of delivering a high frequency of desired outcomes and combining the selected favourable rust resistance, end-use quality and grain yield alleles. However, when costs were included in the model, the combination of MAS at the BC1F1 and haploid stages was identified as the optimal strategy. A detailed economic analysis showed that incorporating marker selection at these two stages not only increased genetic gain over the phenotypic alternative but actually reduced the overall cost by 40%.
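
The simulation model itself is not given in the abstract, so the following is only a back-of-envelope sketch of the ranking logic described: each stage at which markers are applied is assumed to add some genetic gain and some cost, and candidate strategies are compared both on gain alone and on gain per unit cost. Every number below is an invented placeholder, not a value from the study.

    # Illustrative placeholders only. The point is that the ranking by gain
    # (all three MAS stages win) and the ranking by gain per unit cost
    # (two stages win) can disagree, which is the trade-off the paper models.
    from itertools import combinations

    STAGES = {                            # stage -> (assumed gain, assumed cost)
        "MAS at BC1F1 enrichment": (0.30, 400.0),
        "MAS at haploid stage":    (0.40, 600.0),
        "MAS for DH background":   (0.15, 900.0),
    }
    BASE_GAIN, BASE_COST = 0.20, 1000.0   # phenotypic selection alone

    def evaluate(strategy):
        """Total gain, total cost and cost efficiency of a set of MAS stages."""
        gain = BASE_GAIN + sum(STAGES[s][0] for s in strategy)
        cost = BASE_COST + sum(STAGES[s][1] for s in strategy)
        return gain, cost, gain / cost

    for r in range(len(STAGES) + 1):
        for strategy in combinations(STAGES, r):
            gain, cost, eff = evaluate(strategy)
            label = " + ".join(strategy) or "phenotypic only"
            print(f"{label:60s} gain={gain:.2f} cost={cost:6.0f} gain/cost={eff:.5f}")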

Relevance: 90.00%

Abstract:

William Aldren Turner (1864-1945), in his day Physician to the National Hospital, Queen Square, and to King's College Hospital, London, was one of the major figures in the world of epileptology in the period between Hughlings Jackson in the latter part of the 19th century and the advent of electroencephalography in the 1930s. Although he also made contributions to knowledge in other areas of neurology, and with Grainger Stewart wrote a competent textbook on that subject, Turner's main professional interest throughout his career seems to have been epilepsy. On the basis of a series of earlier, rather heavily statistical, personal publications dealing with various aspects of the disorder, he authored what became a well-accepted monograph entitled Epilepsy: a study of the idiopathic disorder, which appeared in 1907, and he also gave the 1910 Morison lectures in Edinburgh on the topic. His writings on epilepsy over a period of three decades consolidated knowledge rather than led to significant advances, but helped maintain interest in the disorder during a rather long fallow phase in the development of the understanding of its nature.

Relevance: 90.00%

Abstract:

Adrenomedullin (AM), a potent vasoactive peptide, is elevated in certain disease states such as sepsis. Its role as a physiologically relevant peptide has been confirmed with the advent of the homozygous lethal AM peptide knockout mouse. So far, there have been few, and conflicting, studies examining the regulatory role of AM at the receptor level. In this article, we discuss the few studies that have been presented on the desensitisation of AM receptors and also present novel data on the desensitisation of endogenous AM receptors in Rat-2 fibroblasts.

Relevance: 90.00%

Abstract:

With the advent of globalisation, companies all around the world must improve their performance in order to survive. Threats come from everywhere and in different forms, such as low-cost products, high-quality products, new technologies, and new products. Companies in different countries use various techniques and quality award criteria to strive for excellence. Continuous improvement techniques are used to enable companies to improve their operations; thus companies use techniques such as TQM, Kaizen, Six Sigma and Lean Manufacturing, and quality award criteria such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of the use of continuous improvement tools and techniques, Mexico formally began by creating its National Quality Award soon after the United States established the Malcolm Baldrige National Quality Award. The United Kingdom formally started by using the European Quality Award (EQA), since modified and renamed the EFQM Excellence Model. The methodology used in this study was to undertake a literature review of the subject matter and to study some general applications around the world. A questionnaire survey was then designed and undertaken in the two countries, based on the same scale, about the same sample size, and about the same industrial sector. The survey presents a brief definition of each construct to facilitate understanding of the questions. The analysis of the data was conducted with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables the companies to use the results to benchmark themselves and thus act to reinforce their strengths and reduce their weaknesses.

Relevance: 90.00%

Abstract:

The advent of Internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This paper reports on an assessment of the branches of a Portuguese bank in terms of their performance in their new roles in three different areas: their efficiency in fostering the use of new transaction channels, their efficiency in increasing sales and their customer base, and their efficiency in generating profits. Service quality is also a major issue in service organisations like bank branches, and therefore we analyse the way this dimension of performance has been accounted for in the literature and take it into account in our empirical application. We have used data envelopment analysis (DEA) for the different performance assessments, but we depart from traditional DEA models in some cases. Performance comparisons on each dimension allowed us to identify benchmark bank branches and also problematic bank branches. In addition, we found positive links between operational and profit efficiency and also between transactional and operational efficiency. Service quality is positively related with operational and profit efficiency.
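
The paper notes that it departs from traditional DEA models in places, and neither those models nor the branch data appear in the abstract. As a minimal sketch of the baseline technique, the standard input-oriented CCR envelopment model below shows the linear program a basic DEA assessment solves once per branch; the branch data, input and output choices, and function names are invented for illustration.

    # Minimal sketch of input-oriented CCR DEA (not the paper's exact models).
    # Efficiency of unit o is the optimal theta of a small linear program:
    #   min theta  s.t.  sum_j lambda_j * x_ij <= theta * x_io   (each input i)
    #                    sum_j lambda_j * y_rj >= y_ro           (each output r)
    #                    lambda_j >= 0
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """CCR efficiency of unit o; X is (inputs x units), Y is (outputs x units)."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.zeros(n + 1)                        # z = [theta, lambda_1..lambda_n]
        c[0] = 1.0                                 # minimise theta
        A_in = np.hstack([-X[:, [o]], X])          # sum_j lambda_j*x_ij - theta*x_io <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])  # -sum_j lambda_j*y_rj <= -y_ro
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        return res.x[0]

    # Invented data: 4 branches, inputs = [staff, floor space], outputs = [sales, profit].
    X = np.array([[5.0, 8.0, 4.0, 6.0],
                  [3.0, 5.0, 2.0, 7.0]])
    Y = np.array([[9.0, 10.0, 6.0, 4.0],
                  [4.0,  6.0, 3.0, 1.0]])
    for o in range(X.shape[1]):
        print(f"branch {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")

An efficiency of 1 marks a benchmark branch on the frontier; a value below 1 is the proportion to which the branch's inputs could in principle be shrunk while still producing its outputs.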

Relevance: 90.00%

Abstract:

Synaptic plasticity is the dynamic regulation of the strength of synaptic communication between nerve cells. It is central to neuronal development as well as experience-dependent remodeling of the adult nervous system, as occurs during memory formation. Aberrant forms of synaptic plasticity also accompany a variety of neurological and psychiatric diseases, and unraveling the biological basis of synaptic plasticity has been a major goal in neurobiology research. The biochemical and structural mechanisms underlying different forms of synaptic plasticity are complex, involving multiple signaling cascades, reconfigurations of structural proteins and the trafficking of synaptic proteins. As such, proteomics should be a valuable tool in dissecting the molecular events underlying normal and disease-related forms of plasticity. To date, however, progress in this area has been disappointingly slow. We discuss the particular challenges associated with proteomic interrogation of synaptic plasticity processes and outline ways in which we believe proteomics may advance the field over the next few years. We pay particular attention to technical advances being made in small-sample proteomics and the advent of proteomic imaging in studying brain plasticity.

Relevance: 90.00%

Abstract:

The roots of the concept of cortical columns stretch far back into the history of neuroscience. The impulse to compartmentalise the cortex into functional units can be seen at work in the phrenology of the beginning of the nineteenth century. At the beginning of the next century Korbinian Brodmann and several others published treatises on cortical architectonics. Later, in the middle of that century, Lorente de Nó wrote of chains of ‘reverberatory’ neurons orthogonal to the pial surface of the cortex and called them ‘elementary units of cortical activity’. This is the first hint that a columnar organisation might exist. With the advent of microelectrode recording, first Vernon Mountcastle (1957) and then David Hubel and Torsten Wiesel provided evidence consistent with the idea that columns might constitute units of physiological activity. This idea was backed up in the 1970s by clever histochemical techniques and culminated in Hubel and Wiesel’s well-known ‘ice-cube’ model of the cortex and Szentágothai’s brilliant iconography. The cortical column can thus be seen as the terminus ad quem of several great lines of neuroscientific research: currents originating in phrenology and passing through cytoarchitectonics; currents originating in neurocytology and passing through Lorente de Nó. Famously, Huxley noted the tragedy of a beautiful hypothesis destroyed by an ugly fact. Famously, too, human visual perception is orientated toward seeing edges and demarcations when, perhaps, they are not there. Recently the concept of cortical columns has come in for the same radical criticism that undermined the architectonics of the early part of the twentieth century. Does history repeat itself? This paper reviews this history and asks the question.

Relevance: 90.00%

Abstract:

In this contribution I look at three episodes in the history of neurophysiology that bring out the complex relationship between seeing and believing. I start with Vesalius in the mid-sixteenth century, who writes that he can in no way see any cavity in nerves, even in the optic nerves. He thus questions the age-old theory (dating back to the Alexandrians in the third century BC) but, because of the overarching psychophysiology of his time, does not press his case. This conflict between observation and theory persisted for a quarter of a millennium until finally resolved at the beginning of the nineteenth century by the discoveries of Galvani and Volta. The second case is provided by the early history of retinal synaptology. Schultze in 1866 had represented rod spherules and bipolar dendrites in the outer plexiform layer as being separated by a (synaptic) gap, yet in his written account, because of his theoretical commitments, held them to be continuous. Cajal later (1892) criticized Schultze for this pusillanimity, but his own figure in La Cellule is by no means clear. It was only with the advent of electron microscopy in the mid-twentieth century that the true complexity of the junction was revealed and it was shown that both investigators were partially right. My final example comes from the Hodgkin-Huxley biophysics of the 1950s. Their theory of the action potential depended on the existence of unseen ion pores with quite complex biophysical characteristics. These were not seen until the Nobel-Prize-winning X-ray diffraction analyses of the early twenty-first century. Seeing, even at several removes, then confirmed Hodgkin and Huxley’s belief. The relation between seeing and believing is by no means straightforward.

Relevance: 90.00%

Abstract:

With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
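
The thesis's simulator itself (UNIX processes communicating through software ports on a user-specified topology) is not reproduced in the abstract. The toy below is a hypothetical sketch of one simple adaptive policy of the kind such a testbed can compare: migrate a process from the most-loaded to the least-loaded processor only while the imbalance exceeds a threshold, so that migration stops once its cost would outweigh the benefit.

    # Hypothetical threshold policy, not an algorithm from the thesis.
    import random

    def balance_step(loads, threshold=2):
        """Migrate one process from the busiest to the idlest processor
        if their queue lengths differ by more than `threshold`."""
        hi = max(range(len(loads)), key=loads.__getitem__)
        lo = min(range(len(loads)), key=loads.__getitem__)
        if loads[hi] - loads[lo] > threshold:
            loads[hi] -= 1     # one process migrates hi -> lo
            loads[lo] += 1
            return True        # a migration happened; try again
        return False           # balanced enough; further migration would not pay

    random.seed(1)
    loads = [random.randint(0, 10) for _ in range(6)]   # simulated queue lengths
    print("before:", loads)
    while balance_step(loads):
        pass
    print("after: ", loads)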

Relevance: 90.00%

Abstract:

The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time to become a more outward-looking and forward-looking document designed to promote the company and its performance to a wide range of stakeholders, rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver for this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition between the discourses is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension only being achievable through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis is based upon an investigation of the inherent inconsistencies in such corporate reports, and the analysis makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. Thus the development of a semiology of corporate reporting is one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and the implications for corporate reporting.

Relevance: 90.00%

Abstract:

The advent of DNA vaccines has heralded a new technology allowing the design and elicitation of immune responses more adequate for a wider range of pathogens. The formulation of these vaccines into the desired dosage forms extends their capability in terms of stability, routes of administration and efficacy. This thesis describes an investigation into the fabrication of plasmid DNA, the active principle of DNA vaccines, into microspheres, based on the tenet of an increased cellular uptake of microparticulate matter by phagocytic cells. The formulation of plasmid DNA into microspheres using two methods is presented: the double emulsion solvent evaporation method and a spray-drying method. The former approach involves formation of a double emulsion by homogenisation. This method produced microspheres of uniform size and smooth morphology, but had a detrimental effect on the formulated DNA. The spray-drying method resulted in microspheres with an improved preservation of DNA stability. The use of polyethylenimine (PEI) and stearylamine (SA) as agents in the microspheric formulation of plasmid DNA is a novel approach to DNA vaccine design. Using these molecules as model positively-charged agents, their influence on the characteristics of the microspheric formulations was investigated. PEI improved the entrapment efficiency of the plasmid DNA in microspheres, and had minimal effect on the surface charge, morphology or size distribution of the formulations. Stearylamine effected an increase in the entrapment efficiency and stability of the plasmid DNA, and its effect on the microsphere morphology was dependent on the method of preparation. The differences in the effects of the two molecules on microsphere formulations may be attributable to their dissimilar physico-chemical properties: PEI is water-soluble and highly branched, while SA is hydrophobic and amphipathic. The positive charge of both molecules is imparted by amine functional groups. Preliminary data on the in vivo application of a formulated DNA vaccine, using a hepatitis B plasmid, showed superior humoral responses to the formulated antigen compared with the free (unformulated) antigen.