796 results for "Notion of code"
Abstract:
Using examples from contemporary policy and business discourses, and exemplary historical texts dealing with the notion of value, I put forward an argument as to why a critical scholarship that draws on media history, language analysis, philosophy and political economy is necessary to understand the dynamics of what is being called 'the global knowledge economy'. I argue that the social changes associated with new modes of value determination are closely associated with new media forms.
Abstract:
Acyl glucuronides are reactive metabolites of carboxylate drugs, able to undergo a number of reactions in vitro and in vivo, including isomerization via intramolecular rearrangement and covalent adduct formation with proteins. The intrinsic reactivity of a particular acyl glucuronide depends upon the chemical makeup of the drug moiety. The least reactive acyl glucuronide yet reported is valproic acid acyl glucuronide (VPA-G), which is the major metabolite of the antiepileptic agent valproic acid (VPA). In this study, we showed that both VPA-G and its rearrangement isomers (iso-VPA-G) interacted with bovine brain microtubular protein (MTP, comprised of 85% tubulin and 15% microtubule-associated proteins [MAPs]). MTP was incubated with VPA, VPA-G and iso-VPA-G for 2 h at room temperature and pH 7.5 at various concentrations up to 4 mM. VPA-G and iso-VPA-G caused dose-dependent inhibition of assembly of MTP into microtubules, with 50% inhibition (IC50) values of 1.0 and 0.2 mM respectively, indicating that iso-VPA-G is five times more potent an inhibitor than VPA-G. VPA itself did not inhibit microtubule formation except at very high concentrations (≥ 2 mM). Dialysis to remove unbound VPA-G and iso-VPA-G (prior to the assembly assay) diminished, but did not abolish, the inhibition. Comparison of covalent binding of VPA-G and iso-VPA-G (using [C-14]-labelled species) showed that adduct formation was much greater for iso-VPA-G. When [C-14]-iso-VPA-G was reacted with MTP in the presence of sodium cyanide (to stabilize glycation adducts), subsequent separation into tubulin and MAPs fractions by ion exchange chromatography revealed that 78 and 22% of the covalent binding occurred with the MAPs and tubulin fractions respectively. These experiments support the notion that both covalent and reversible binding play a part in the inhibition of microtubule formation from MTP (though the acyl glucuronide of VPA is less important than its rearrangement isomers in this regard), and that both tubulin and (perhaps more importantly) MAPs form adducts with acyl glucuronides. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
The notion of implicature was first introduced by Paul Grice (1967, 1989), who defined it essentially as what is communicated less what is said. This definition contributed in part to the proliferation of a large number of different species of implicature by neo-Griceans. Relevance theorists have responded to this by proposing a shift back to the distinction between "explicit" and "implicit" meaning (corresponding to "explicature" and "implicature," respectively). However, they appear to have pared down the concept of implicature too much, ignoring phenomena that may be better treated as implicatures in their overgeneralization of the concept of explicature. These problems have their roots in the fact that explicit and implicit meaning intuitively overlap and thus do not provide a suitable basis for distinguishing implicature from other types of pragmatic phenomena. An alternative conceptualization of implicature, based on the concept of "implying" with which Grice originally associated his notion of implicature, is thus proposed. On this definition, an implicature is something additional inferred by the addressee that is not literally said by the speaker; it is meant in addition to what the speaker literally says and is consequently defeasible, like all other types of pragmatic phenomena.
Abstract:
A new algebraic Bethe ansatz scheme is proposed to diagonalize classes of integrable models relevant to the description of Bose-Einstein condensation in dilute alkali gases. This is achieved by introducing the notion of Z-graded representations of the Yang-Baxter algebra. (C) 2003 American Institute of Physics.
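For orientation only, the following is a minimal sketch, in one standard convention (assumed here, not quoted from the paper), of the Yang-Baxter equation whose solutions underlie the Yang-Baxter algebra mentioned above:

```latex
% Yang-Baxter equation, one standard convention (assumed):
% R(u) acts on V \otimes V; subscripts indicate which two factors
% of V \otimes V \otimes V the operator acts on.
R_{12}(u-v)\, R_{13}(u)\, R_{23}(v) \;=\; R_{23}(v)\, R_{13}(u)\, R_{12}(u-v)
```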
Abstract:
Upper Devonian to Lower Carboniferous strata of the Campwyn Volcanics of east central Queensland preserve a substantial sequence of first-cycle volcaniclastic sedimentary and coeval volcanic rocks that record prolonged volcanic activity along the northern New England Fold Belt. The style and scale of volcanism varied with time, producing an Upper Devonian sequence of mafic volcano-sedimentary rocks overlain by a rhyolitic ignimbrite-dominated sequence that passes upward into a Lower Carboniferous limestone-bearing sedimentary sequence. We define two facies associations for the Campwyn Volcanics. A lower facies association is dominated by mafic volcanic-derived sedimentary breccias with subordinate primary mafic volcanic rocks comprising predominantly hyaloclastite and peperite. Sedimentary breccias record episodic and high energy, subaqueous depositional events with clastic material sourced from a mafic lava-dominated terrain. Some breccias contain a high proportion of attenuated dense, glassy mafic juvenile clasts, suggesting a syn-eruptive origin. The lower facies association coarsens upwards from a lithic sand-dominated sequence through a thick interval of pebble- to boulder-grade polymict volcaniclastic breccias, culminating in facies that demonstrate subaerial exposure. The silicic upper facies association marks a significant change in eruptive style, magma composition and the nature of eruptive sources, as well as the widespread development of subaerial depositional conditions. Crystal-rich, high-grade, low- to high-silica rhyolite ignimbrites dominate the base of this facies association. Biostratigraphic age controls indicate that the ignimbrite-bearing sequences are Famennian to lower-mid Tournaisian in age. The ignimbrites represent extra-caldera facies with individual units up to 40 m thick and mostly lacking coarse lithic breccias. Thick deposits of pyroclastic material interbedded with fine-grained siliceous sandstone and mudstone (locally radiolarian-bearing) were deposited from pyroclastic flows that crossed palaeoshorelines or represent syn-eruptive, resedimented pyroclastic material. Some block-bearing lithic-pumice-crystal breccias may also reflect more proximal subaqueous silicic explosive eruptions. Crystal-lithic sandstones interbedded with, and overlying the ignimbrites, contain abundant detrital volcanic quartz and feldspar derived from the pyroclastic deposits. Limestone is common in the upper part of the upper facies association, and several beds are oolitic (cf. Rockhampton Group of the Yarrol terrane). Overall, the upper facies association fines upward and is transgressive, recording a return to shallow-marine conditions. Palaeocurrent data from all stratigraphic levels in the Campwyn Volcanics indicate that the regional sediment-dispersal direction was to the northwest, and opposed to the generally accepted notion of easterly sediment dispersal from a volcanic arc source. The silicic upper facies association correlates in age and lithology to Early Carboniferous silicic volcanism in the Drummond (Cycle 1) and Burdekin Basins, Connors Arch, and in the Yarrol terranes of eastern Queensland. The widespread development of silicic volcanism in the Early Carboniferous indicates that silicic (rift-related) magmatism was not restricted to the Drummond Basin, but was part of a more substantial silicic igneous province.
Abstract:
The purpose of this study was to analyze the development of four 20-year-old elite hockey players through an in-depth examination of their sporting activities. The theoretical framework of deliberate practice (Ericsson, Krampe, & Tesch-Römer, 1993) and the notion of deliberate play (Côté, 1999) served as the theoretical foundations. Interviews were conducted to provide a longitudinal and detailed account of each participant's involvement in various sporting activities. The interviewer asked questions about the conditions and sporting activities for each year of development. The data obtained were validated through independent interviews conducted with three parents of three different athletes. The results were consistent with Côté's (1999) three stages of development in sport: the sampling (age 6-12), specializing (age 13-15), and investment (age 16+) years.
Abstract:
Borderline hypertension (BH) has been associated with an exaggerated blood pressure (BP) response during laboratory stressors. However, the incidence of target organ damage in this condition and its relation to BP hyperreactivity is an unsettled issue. Thus, we assessed the Doppler echocardiographic profile of a group of men classified as BH by office BP measurements (N = 36) who showed an exaggerated BP response in the cycloergometric test. A group of normotensive men (NT, N = 36) with a normal BP response during the cycloergometric test was used as control. To assess vascular function and reactivity, all subjects were submitted to the cold pressor test. Before Doppler echocardiography, the BP profile of all subjects was evaluated by 24-h ambulatory BP monitoring. All subjects from the NT group presented normal monitored levels of BP. In contrast, 19 subjects from the original BH group presented normal monitored BP levels and 17 presented elevated monitored BP levels. In the NT group all Doppler echocardiographic indexes were normal. All subjects from the original BH group presented normal left ventricular mass and geometrical pattern. However, in the subjects with elevated monitored BP levels, fractional shortening was greater, isovolumetric relaxation time was longer, and the early-to-late flow velocity ratio was reduced in relation to subjects from the original BH group with normal monitored BP levels (P < 0.05). These subjects also presented an exaggerated BP response during the cold pressor test. These results support the notion of an integrated pattern of cardiac and vascular adaptation during the development of hypertension.
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software, and developers are dedicating a larger portion of their code to implementing them. Given this increased importance, the correctness of GUI code is becoming essential. This paper describes the latest results in the development of GUISurfer, a tool to reverse engineer the GUI layer of interactive computing systems. The ultimate goal of the tool is to enable the analysis of interactive systems from their source code.
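As an illustration only (not GUISurfer's actual implementation), the following is a minimal sketch of the kind of static scan a GUI reverse-engineering step might perform, here extracting event-handler registrations from Java Swing source files; the regular expression and directory name are assumptions for the example:

```python
import re
from pathlib import Path

# Hypothetical pattern: find Swing listener registrations such as
#   button.addActionListener(handler);
LISTENER_RE = re.compile(r"(\w+)\.add(\w+)Listener\(\s*(\w+)")

def scan_gui_layer(src_dir: str) -> list[tuple[str, str, str]]:
    """Return (widget, event_kind, handler) triples found in .java files."""
    hits = []
    for path in Path(src_dir).rglob("*.java"):
        text = path.read_text(errors="ignore")
        for widget, kind, handler in LISTENER_RE.findall(text):
            hits.append((widget, kind, handler))
    return hits

if __name__ == "__main__":
    # "src" is a placeholder for the application's source tree.
    for widget, kind, handler in scan_gui_layer("src"):
        print(f"{widget} --{kind}--> {handler}")
```

A real tool would of course build a behavioural model from the parsed abstract syntax tree rather than from textual matches; the sketch only illustrates the idea of recovering GUI behaviour from code.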
Abstract:
More and more current software systems rely on non-trivial coordination logic for combining autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and therefore difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the actually invoked services as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework. Therefore, the scope of application of COORDINSPECTOR is quite large: potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
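A rough sketch of the underlying idea, under the assumption that the coordination layer can be approximated by the external service calls visible in disassembled CIL text; the `ServiceClient` naming pattern and the graph shape are illustrative, not COORDINSPECTOR's actual slicing algorithm:

```python
import re
from collections import defaultdict

# Hypothetical pattern for service invocations in ildasm-style CIL text, e.g.
#   IL_0010: call instance string PaymentServiceClient::Authorize(string)
CALL_RE = re.compile(r"call.*?\b(\w+ServiceClient)::(\w+)")

def coordination_graph(cil_text: str) -> dict:
    """Group invoked operations by the service proxy they belong to."""
    graph = defaultdict(set)
    for service, operation in CALL_RE.findall(cil_text):
        graph[service].add(operation)
    return dict(graph)

# Example usage with a toy CIL fragment:
cil = """
  IL_0005: call instance string BookingServiceClient::Reserve(string)
  IL_0010: call instance bool   PaymentServiceClient::Authorize(string)
"""
print(coordination_graph(cil))
```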
Abstract:
The significant number of publications describing unsuccessful cases in the introduction of health information systems makes it advisable to analyze the factors that may be contributing to such failures. However, the very notion of success is not understood in the same way in all publications. Based on a literature review, the authors argue that the introduction of systems must rest on an eclectic combination of knowledge fields, adopting methodologies that strengthen the role of organizational culture and human resources in the project as a whole. On the other hand, the authors argue that the introduction of systems should be guided by a previously defined matrix of factors against which success can be measured.
Abstract:
In the category of Hom-Leibniz algebras we introduce the notion of Hom-corepresentation as adequate coefficients to construct the chain complex from which we compute the Leibniz homology of Hom-Leibniz algebras. We study universal central extensions of Hom-Leibniz algebras and generalize some classical results; nevertheless, it is necessary to introduce the new notions of α-central extension, universal α-central extension and α-perfect Hom-Leibniz algebra, owing to the fact that the composition of two central extensions of Hom-Leibniz algebras is not necessarily central. We also provide recognition criteria for these kinds of universal central extensions. We prove that an α-perfect Hom-Lie algebra admits a universal α-central extension in the categories of Hom-Lie and Hom-Leibniz algebras, and we obtain the relationship between the two. In the case α = Id we recover the corresponding results on universal central extensions of Leibniz algebras.
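For orientation, here is a minimal statement, in one common convention assumed for illustration rather than quoted from the paper, of the identity that the bracket and twisting map α of a Hom-Leibniz algebra are required to satisfy:

```latex
% Hom-Leibniz algebra (L, [\cdot,\cdot], \alpha), one common convention (assumed):
% for all x, y, z \in L the bilinear bracket satisfies
[\alpha(x),[y,z]] \;=\; [[x,y],\alpha(z)] \;-\; [[x,z],\alpha(y)]
% For \alpha = \mathrm{Id} this reduces to the ordinary Leibniz identity.
```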
Abstract:
As its amount of debt has gradually increased, particularly in recent years, Portugal currently exhibits one of the highest levels of overall indebtedness in Europe, in both the sovereign and private sectors. Indeed, this condition is the outcome of increasing levels of debt assumed not only by the government, but also by companies and families, the latter mostly due to mortgage loans and the associated charges. This paper focuses on the study of borrowing by Portuguese households. The research addresses the notion of debt and the consequences of recent developments in debt, among other factors. In order to analyse the factors most associated with debt, a study was developed using two multiple regression models, one based on a longer time series and the other on a shorter one, evaluating the effect of several variables, such as consumption, savings, unemployment, inflation and interest rates, in order to check whether they could be associated with a higher level of debt.
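A hedged sketch of the kind of multiple regression described above, using ordinary least squares from statsmodels on hypothetical quarterly series; the variable names, coefficients and data are illustrative assumptions, not the study's actual dataset or model:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical quarterly data; in the study these would be observed Portuguese series.
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "consumption":   rng.normal(100, 5, n),
    "savings":       rng.normal(10, 2, n),
    "unemployment":  rng.normal(9, 1.5, n),
    "inflation":     rng.normal(2, 0.5, n),
    "interest_rate": rng.normal(4, 1, n),
})
# Toy response: household debt as a noisy linear combination of the regressors.
df["household_debt"] = (
    0.8 * df["consumption"] - 1.2 * df["savings"]
    + 0.5 * df["unemployment"] + rng.normal(0, 2, n)
)

X = sm.add_constant(df[["consumption", "savings", "unemployment",
                        "inflation", "interest_rate"]])
model = sm.OLS(df["household_debt"], X).fit()
print(model.summary())
```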
Abstract:
Despite the still-present hegemony of the structural-functionalist orthodoxy, the mid-1980s witnessed the emergence of new philosophical approaches, such as structuration theory, labour process theory and neoinstitutionalist theory. This body of work became a vital intellectual and ideological resource for those who wanted to confront the functionalist dominance in organization studies. The purpose of this paper is to review the incorporation of Bourdieu's work into neoinstitutionalism. I argue that this appropriation has resulted in a significant loss of theoretical strength. By adopting the cognitivist metaphors of mental models, "scripts" and "schemas" instead of the notion of habitus, neoinstitutionalism reinforces some of the ever-present dichotomies in the social sciences, especially those of agency/structure and individual/society. While neoinstitutionalism was refining its cognitive approach in the 1990s, Bourdieu was moving towards psychoanalysis. Some indications for future research are provided in the concluding notes.
The Experience of the Religious through Silent Moving Image and the Silence of Bill Viola's Passions
Abstract:
With the creation of the moving image at the end of the 19th century, a new way of representing and expressing the Religious was born. The cinema industry rapidly understood that film was a powerful means of attracting new audiences and transformed the explicit religious message into the implicit theological discourse of the fictional film. Today, the concept of "cinema" needs to be rethought and expanded, as does the notion of the "Transcendental", since the strong reality effect of film can allow a true religious experience for the spectator.
Abstract:
One of the major problems that prevents the spread of elections with the possibility of remote voting over electronic networks, also called Internet Voting, is the use of unreliable client platforms, such as the voter's computer and the Internet infrastructure connecting it to the election server. A computer connected to the Internet is exposed to viruses, worms, Trojans, spyware, malware and other threats that can compromise the election's integrity. For instance, it is possible to write a virus that changes the voter's vote to a predetermined vote on election day. Another possible attack is the creation of a fake election web site where the voter uses a malicious vote program that manipulates the voter's vote (a phishing/pharming attack). Such attacks may not disturb the election protocol and can therefore remain undetected in the eyes of the election auditors. We propose the use of Code Voting to overcome the insecurity of the client platform. Code Voting consists of creating a secure communication channel for communicating the voter's vote between the voter and a trusted component attached to the voter's computer. Consequently, no one controlling the voter's computer can change the voter's vote. The trusted component can then process the vote according to a cryptographic voting protocol to enable cryptographic verification on the server side.
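A minimal sketch of the code-voting idea described above, assuming a pre-distributed code sheet that maps each candidate to a random vote code; the candidate names and code format are illustrative, and a real deployment would add the trusted component and the cryptographic protocol steps mentioned in the abstract:

```python
import secrets

CANDIDATES = ["Alice", "Bob", "Carol"]  # illustrative ballot

def make_code_sheet(candidates):
    """Election authority: generate a per-voter sheet of distinct random vote codes."""
    codes = set()
    while len(codes) < len(candidates):
        codes.add(f"{secrets.randbelow(10**6):06d}")
    return dict(zip(candidates, sorted(codes)))

# The sheet is delivered to the voter out of band (e.g. by post); only the
# authority keeps the reverse mapping, so malware on the voter's PC sees
# an opaque code rather than the chosen candidate.
sheet = make_code_sheet(CANDIDATES)
reverse = {code: name for name, code in sheet.items()}

chosen = "Bob"                           # voter's intent, known only off-device
submitted_code = sheet[chosen]           # the only value typed into the browser
recorded_vote = reverse[submitted_code]  # resolved by the election authority

assert recorded_vote == chosen
```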