982 results for DETERMINES


Relevance: 10.00%

Abstract:

Three simulations of evapotranspiration were carried out with two values of the time step, viz. 10 min and one day. Inputs to the model were weather data, including directly measured upward and downward radiation, and soil characteristics. Three soils were used for each simulation. Analysis of the results shows that the time step has a direct influence on the prediction of potential evapotranspiration, but a complex interaction of this effect with the soil moisture characteristic, the rate of increase of ground cover, and bare soil evaporation determines the actual transpiration predicted. The results indicate that as small a time step as possible should be used in the simulation.
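The abstract's central point, that step size interacts with moisture limitation, can be illustrated with a toy water-balance integration (a minimal sketch; the stress function, parameter values, and variable names are illustrative assumptions, not the model used in the study):

```python
# Toy explicit water-balance integration run at two time steps.
# All parameters are illustrative assumptions, not the study's model.

def simulate(dt_hours, days=30, w0=50.0, pet_rate=5.0 / 24.0, w_crit=40.0):
    """Return cumulative actual ET (mm); demand is throttled when the
    soil store w falls below the critical level w_crit."""
    w = w0                                   # soil water store (mm)
    total_aet = 0.0
    for _ in range(int(days * 24 / dt_hours)):
        pet = pet_rate * dt_hours            # potential ET this step (mm)
        stress = min(1.0, w / w_crit)        # moisture limitation factor
        aet = min(pet * stress, w)           # actual ET cannot exceed store
        w -= aet
        total_aet += aet
    return total_aet

fine = simulate(dt_hours=10 / 60)   # 10-minute step
coarse = simulate(dt_hours=24)      # one-day step
# The daily step applies a whole day's demand against a single moisture
# state, so the two runs drift apart as the soil dries.
```

The divergence grows once the store drops below the critical level, mirroring the interaction between time step and soil moisture characteristic described above.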

Relevance: 10.00%

Abstract:

Following the method of Ioffe and Smilga, the propagation of the baryon current in an external constant axial-vector field is considered. The close similarity of the operator-product expansion with and without an external field is shown to arise from the chiral invariance of gauge interactions in perturbation theory. Several sum rules corresponding to various invariants, both for the nucleon and the hyperons, are derived. The analysis of the sum rules is carried out by two independent methods, one called the ratio method and the other called the continuum method, paying special attention to the nondiagonal transitions induced by the external field between the ground state and excited states. Up to operators of dimension six, two new external-field-induced vacuum expectation values enter the calculations. Previous work determining these expectation values from PCAC (partial conservation of the axial-vector current) is utilized. Our determination from the sum rules of the nucleon axial-vector renormalization constant GA, as well as of the Cabibbo coupling constants in the SU(3)-symmetric limit (ms = 0), is in reasonable accord with the experimental values. Uncertainties in the analysis are pointed out. The case of broken flavor SU(3) symmetry is also considered. While in the ratio method the results are stable under variation of the fiducial interval of the Borel mass parameter over which the left-hand side and the right-hand side of the sum rules are matched, in the continuum method the results are less stable. Another set of sum rules determines the value of the linear combination 7F−5D to be ≊0, or D/(F+D)≊7/12.

Relevance: 10.00%

Abstract:

Consider a general regression model with an arbitrary and unknown link function and a stochastic selection variable that determines whether the outcome variable is observable or missing. The paper proposes U-statistics that are based on kernel functions as estimators for the directions of the parameter vectors in the link function and the selection equation, and shows that these estimators are consistent and asymptotically normal.

Relevance: 10.00%

Abstract:

This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts; some referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text.
Iser believes that the text guides the reader, but as he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity in the interpretations and the fact that the agreement is strongest when related to issues that are discussed broadly in the text do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation where a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules. Wittgenstein suggests that our understanding of words is related to the shared ways of using words and our understanding of human behaviour. This view seems to give better grounds for understanding similarity and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.

Relevance: 10.00%

Abstract:

Arylalkylcyclopropenethiones undergo highly regioselective photochemical α-cleavage via thioketene carbene intermediates, giving rise to products derived from the less stabilized carbene. UHF MINDO/3 calculations provide an insight into this unexpected regioselectivity. The nπ* triplet of cyclopropenethione is calculated to have a highly unsymmetrical geometry with an elongated C-C bond, a delocalized thiaallyl fragment, and a pyramidal radicaloid carbon (which eventually becomes the carbene center). From this molecular electronic structure, aryl group stabilization is expected to be more effective at the thiaallyl group than at the pyramidal radical center. Thus, the stability of the substituted triplet thione rather than that of the thioketene carbene determines the preferred regiochemistry of cleavage. The unusual structure of the cyclopropenethione triplet is suggested to be related to one of the Jahn-Teller distorted forms of the cyclopropenyl radical. An alternative symmetrical structure is adopted by the corresponding triplet of cyclopropenone, partly accounting for its differing photobehavior. A similar structural dichotomy is demonstrated for the corresponding radical anions as well.

Relevance: 10.00%

Abstract:

Quantifying the potential spread and density of an invading organism enables decision-makers to determine the most appropriate response to incursions. We present two linked models that estimate the spread of Solenopsis invicta Buren (red imported fire ant) in Australia based on limited data gathered after its discovery in Brisbane in 2001. A stochastic cellular automaton determines spread within a location (100 km by 100 km), and this is coupled with a model that simulates human-mediated movement of S. invicta to new locations. In the absence of any control measures, the models predict that S. invicta could cover 763 000–4 066 000 km² by the year 2035 and be found at 200 separate locations around Australia by 2017–2027, depending on the rate of spread. These estimated rates of expansion (assuming no control efforts were in place) are higher than those experienced in the USA in the 1940s during the early invasion phases in that country. Active control efforts and quarantine controls in the USA (including a concerted eradication attempt in the 1960s) may have slowed spread. Further, milder winters, the presence of the polygynous social form, and increased trade and human mobility in Australia in the 2000s compared with the USA in the 1940s could contribute to faster range expansion.
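The within-location spread component can be sketched as a simple stochastic cellular automaton (grid size, neighbourhood, and colonisation probability here are hypothetical illustrations, not the calibrated S. invicta parameters):

```python
import random

# Stochastic cellular automaton for within-location spread: each occupied
# cell may colonise each of its 4-neighbours with probability p per year.
# Grid size and probability are hypothetical, not the calibrated model.

def step(occupied, size, p_colonise, rng):
    """One annual update over a size x size grid of cells."""
    new = set(occupied)
    for x, y in occupied:
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in new:
                if rng.random() < p_colonise:
                    new.add((nx, ny))
    return new

rng = random.Random(1)                 # fixed seed for a reproducible run
occupied = {(50, 50)}                  # single introduction point
for _ in range(10):                    # simulate a decade of local spread
    occupied = step(occupied, size=100, p_colonise=0.3, rng=rng)
# len(occupied) approximates the infested area (in cells) after 10 years
```

Coupling such a local model to a separate jump process for human-mediated movement, as the abstract describes, would then seed new grids at new locations.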

Relevance: 10.00%

Abstract:

In this paper we develop compilation techniques for realizing applications described in a High Level Language (HLL) on a Runtime Reconfigurable Architecture. The compiler determines Hyper Operations (HyperOps), which are subgraphs of an application's data flow graph comprising elementary operations that have strong producer-consumer relationships. These HyperOps are hosted on computation structures that are provisioned on demand at runtime. We also report compiler optimizations that collectively reduce the overheads of data-driven computations in runtime reconfigurable architectures. On average, HyperOps offer a 44% reduction in total execution time and an 18% reduction in management overheads compared to using basic blocks as coarse-grained operations. We show that HyperOps formed using our compiler are suitable for supporting data flow software pipelining.
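A minimal sketch of the HyperOp idea, grouping data-flow-graph nodes along sole producer-consumer edges, might look as follows (the grouping heuristic and all names are illustrative; the actual compiler's partitioning is more elaborate):

```python
# Greedy clustering of a data flow graph into HyperOp-like groups: fuse a
# producer with its consumer whenever it has exactly one consumer. The
# heuristic and names are illustrative, not the paper's actual algorithm.

def cluster(edges, nodes):
    """edges: (producer, consumer) pairs. Returns node -> cluster id."""
    consumers = {n: [] for n in nodes}
    for p, c in edges:
        consumers[p].append(c)
    parent = {n: n for n in nodes}          # union-find forest
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path halving
            n = parent[n]
        return n
    for p, cs in consumers.items():
        if len(cs) == 1:                    # sole consumer: strong pairing
            parent[find(cs[0])] = find(p)
    return {n: find(n) for n in nodes}

nodes = ["load", "mul", "add", "store", "cmp"]
edges = [("load", "mul"), ("mul", "add"), ("add", "store"), ("add", "cmp")]
groups = cluster(edges, nodes)
# load, mul, add fuse into one group; add has two consumers, so store
# and cmp remain separate clusters.
```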

Relevance: 10.00%

Abstract:

Le naturalisme finlandais. Une conception entropique du quotidien (Finnish Naturalism: An Entropic Conception of Everyday Life). Nineteenth-century naturalism was a strikingly international literary movement. After emerging in France in the 1870s, it spread all over Europe, including young, small nations with a relatively recent literary tradition, such as Finland. This thesis surveys the role and influence of French naturalism in the Finnish literature of the 1880s and 1890s. On the basis of a selection of works by six Finnish authors (Juhani Aho, Minna Canth, Kauppis-Heikki, Teuvo Pakkala, Ina Lange and Karl August Tavaststjerna), the study establishes a view of the main features of Finnish naturalism in comparison with that of French authors such as Zola, Maupassant and Flaubert. The study's methodological framework is genre theory: even though naturalist writers insisted on a transparent description of reality, naturalist texts are firmly rooted in general generic categories with definable relations and constants on which European novels impose variations. By means of two key concepts, "entropy" and "everyday life", this thesis establishes the parameters of the naturalist genre. At the heart of the naturalist novel is a movement in the direction of disintegration and confusion, from order to disorder, from illusion to disillusion. This entropic vision is merged into the representation of everyday life, focusing on socially mediocre characters and discovering their miseries in all their banality and daily grayness. By using Mikhail Bakhtin's idea of literary genres as a means of understanding experience, this thesis suggests that everyday life is an ideological core of naturalist literature that determines not only its thematic but also its generic distinctions: in relation to other genres, such as Balzac's realism, naturalism appears primarily to be a banalization of everyday life.
In idyllic genres, everyday life can be represented by means of sublimation, but a naturalist novel establishes a distressing, negative everyday life and thus strives to take a critical view of modern society. Besides the central themes, the study surveys the generic blends in naturalism. The thesis analyzes how the coalition of naturalism and the melodramatic mode in the work of Minna Canth serves naturalism's ambition to discover the unconscious instincts underlying daily realities, and how the symbolic mode in the work of Juhani Aho duplicates the semantic level of the apparently insignificant, everyday naturalist details. The study compares the naturalist novel to the ideological novel (roman à thèse) and surveys the central dilemma of naturalism, the confrontation between the optimistic belief in social reform and the pessimistic theory of determinism. The thesis proposes that the naturalist novel's contribution to social reform lies in its shock effect. By means of representing the unpleasant truth, the entropy of everyday life, it aims to scandalize the reader and make him aware of the harsh realities that might apply also to him.

Relevance: 10.00%

Abstract:

This study investigated questions related to half-occlusion processing in human stereoscopic vision: (1) How does the depth location of a half-occluding figure affect the depth localization of adjacent monocular objects? (2) Is three-dimensional slant around the vertical axis (the geometric effect) affected by half-occlusion constraints? (3) How are half-occlusion constraints and surface formation processes manifested in stereoscopic capture? Our results showed that the depth localization of binocular objects affects the depth localization of discrete monocular objects. We also showed that the visual system has a preference for a frontoparallel surface interpretation if the half-occlusion configuration allows multiple interpretation alternatives. When surface formation was constrained by textures, our results showed that a process of rematching spreading determines the resulting perception and that the spreading can be limited by illusory contours that support the presence of binocularly unmatched figures. The unmatched figures could be present if the inducing figures producing the illusory surface contained binocular image differences that provided cues for quantitative da Vinci stereopsis. These findings provide evidence of the significant role of half-occlusions in stereoscopic processing.

Relevance: 10.00%

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking.
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, like temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized because access is too slow. These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.

Relevance: 10.00%

Abstract:

Ramp metering (RM) is an access control for motorways in which a traffic signal is placed at on-ramps to regulate the rate of vehicles entering the motorway and thus preserve motorway capacity. In general, RM algorithms fall into two categories by their effective scope: local control and coordinated control. A local control algorithm determines the metering rate based on the traffic conditions on the adjacent motorway mainline and the on-ramp. Conversely, coordinated RM strategies make use of measurements from the entire motorway network to operate individual ramp signals for optimal performance at the network level. This study proposes a multi-hierarchical strategy for on-ramp coordination. The strategy is structured in two layers. At the higher layer, a centralised, predictive controller plans the coordination control over a long update interval based on the location of high-risk breakdown flow. At the lower layer, reactive controllers determine the metering rates of the ramps involved in the coordination with a short update interval. This strategy is modelled and applied to the northbound model of the Pacific Motorway in a micro-simulation platform (AIMSUN). The simulation results show that the proposed strategy effectively delays the onset of congestion and reduces total congestion with better managed on-ramp queues.
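A classic local feedback law of the kind described is ALINEA, which adjusts the metering rate in proportion to the deviation of downstream occupancy from a target. A hedged sketch follows (the gain, target occupancy, and rate bounds are illustrative, and the study's own controllers may differ):

```python
# ALINEA-style local feedback (a well-known local ramp-metering law; not
# necessarily the controller used in this study). Gain, target occupancy,
# and rate bounds below are illustrative.

def alinea(rate_prev, occ_pct, occ_target_pct=18.0, k_r=70.0,
           r_min=200.0, r_max=1800.0):
    """r(k) = r(k-1) + K_R * (target occupancy - measured occupancy),
    clamped to a feasible metering range (veh/h)."""
    rate = rate_prev + k_r * (occ_target_pct - occ_pct)
    return min(r_max, max(r_min, rate))

# Occupancy above target (congestion building) tightens the meter:
r = 1200.0
for occ in (20.0, 22.0, 21.0):
    r = alinea(r, occ)
```

In a coordinated scheme like the one proposed, a higher-layer controller would adjust the targets or bounds of such lower-layer reactive controllers.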

Relevance: 10.00%

Abstract:

Education for a Technological Society: Public School Curriculum Construction, 1945-1952. The subject of my research is the significance of technology in the construction process of the public school curriculum during the years 1945-1952. During this period, war reparations and rebuilding created demands and actions to rationalise and dramatise industry and agriculture. Thereby the ambitions of building a technological country and the reformation of the curriculum took place simultaneously. Fordist terms of reference, whose principles were mass production, rationalisation and standardisation, a hierarchical division of labour and partition of assignments, provided a model for the developing curriculum. In the research the curriculum is examined as an artefact which takes shape socio-technically under the influence of social and technical factors. In the perspective of socio-technical construction, the artefact is represented through the viewpoints of members of relevant social groups. The groups give meaning to the curriculum artefact, which determines the components of the curriculum. The weakness of the curriculum was its ineffectiveness, which was due to three critical problems. Firstly, the curriculum was to be based on scientific work, which meant the development of schools through experiments and scientific research. Secondly, the conception of civilisation in the curriculum was to be composed of theoretical knowledge as well as practical skills. Practical education was useful for both the individual and society. Thirdly, the curriculum was to be reformed in a way that took the individuality of the pupil into account. It was useful for society that the talents and natural abilities of every pupil were observed and used to direct the pupil to the proper place in the social division of labour, according to the "right man in the right place" principle.
The solutions to these critical problems formed the instructions of the public school curriculum, which described the nature and content of education. Technology and its development were an essential part of the whole school curriculum process. The quality words connected to the development of technology - progress, rationality and effectiveness - were also suitable qualifiers and reasons for the reform of the curriculum. On the other hand, technology set a point of comparison and a demand for the development of all phases of education. The view of technology was not clearly deterministic: it was also considered possible to shape technological society with the help of education. The public school curriculum process indicates how the principles of technological systems were originally translated into the language of education and accepted as educational content.

Relevance: 10.00%

Abstract:

The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
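The stopping rule can be sketched as a Bayesian update of the probability of persistence after each absent survey, stopping once the expected damage from declaring eradication falls below the cost of another survey (a simplified illustration; the parameter values and update are hypothetical, not the paper's stochastic dynamic program):

```python
# Simplified illustration of the economic stopping rule: after each year
# of absent surveys, update the probability that the species persists;
# stop once the expected damage from stopping drops below the cost of one
# more survey. All parameter values are hypothetical.

def years_to_survey(p_present, p_detect, survey_cost, damage_cost,
                    max_years=50):
    """Smallest number of consecutive absent surveys before stopping."""
    for year in range(1, max_years + 1):
        # Bayes update given one more survey with no detection
        miss = p_present * (1 - p_detect)
        p_present = miss / (miss + (1 - p_present))
        if p_present * damage_cost < survey_cost:
            return year
    return max_years

n = years_to_survey(p_present=0.5, p_detect=0.6,
                    survey_cost=10_000, damage_cost=1_000_000)
```

Imperfect detection (low `p_detect`) slows the decay of belief in persistence and so lengthens the required run of absent surveys, which is the trade-off the abstract describes.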

Relevance: 10.00%

Abstract:

Haptices and haptemes: a case study of the developmental process in the touch-based communication of people with acquired deafblindness. This research is the first systematic, longitudinal description of the process and development of communication using touch and the body with a person with acquired deafblindness. The research consists of observational and analysed written and video materials, mainly from two informants' experiences over a period of 14 years. The research describes the adaptation of Social-Haptic methods between a couple, as well as other informants' experiences, which have been collated from biographies and through giving national and international courses. When hearing and sight deteriorate due to an acquired deafblind condition, communication consists of multi-systematic and adaptive methods. A person's expressive language, spoken or Sign Language, usually remains unchanged, but the methods of receiving information can change many times during a person's lifetime. Haptices are made from haptemes, which determine which regulations are analysed. In defining haptemes, the definition, classification and varied meanings of touch were discovered. Haptices include sharing a personal body space, the meaning of touch-contact, context and the use of different communication channels. Communication distances are classified as exact distance, estimated distance and touch distance. Physical distance can be termed very long, long, medium or very close. Social body space includes the body areas involved in sending and receiving haptices and applying different types of contacts. One or two hands can produce messages by using different hand shapes and orientations. This research classifies how the body can be divided into different areas, such as body orientation, varied body postures, body position levels, social actions and which side of the body is used. Spatial body space includes environmental and situational elements.
Haptemes of movement are recognised as the direction of movements, change of direction on the body, directions between people, pressure, speed, frequency, size, length, duration, pause, change of rhythm, shape, and macro and micro movements. Haptices share multidimensional meanings and emotions. The research describes haptices in different situations, both enhancing sensory information and functioning as an independent language. Haptices include the social-haptic confirmation system, social quick messages, body drawing, contact with people and the environment, guiding, and sharing art experiences through movements. Five stages of emotional differentiation were identified: very light, light, medium, heavy and very heavy touch. Haptices make it possible to share different art, hobby and game experiences. The development of the new communication system, based on the analysis of the research data, is classified into different phases: experimental initiation, social deconstruction, developing the description of Social-Haptic communication, generalisation of the theory, and finding and conceptualising the haptices and haptemes. The use and description of haptices is a social innovation which illustrates the adaptive function of the body and the perceptual senses and can be taught to a third party. Keywords: deafblindness, hapteme, haptic, haptices, movement, social-haptic communication, social-haptic confirmation system, tactile, touch

Relevance: 10.00%

Abstract:

Kaposi's sarcoma herpesvirus (KSHV) is an oncogenic human virus and the causative agent of three human malignancies: Kaposi's sarcoma (KS), Multicentric Castleman's Disease (MCD), and primary effusion lymphoma (PEL). In tumors, KSHV establishes latent infection, during which it produces no infectious particles. Latently infected cells can enter the lytic replication cycle and, upon provision of appropriate cellular signals, produce progeny virus. PEL, commonly described in patients with AIDS, represents a diffuse large-cell non-Hodgkin's lymphoma with a median survival time of less than six months after diagnosis. As mutations of the tumor suppressor gene TP53 occur rarely in PEL, the aim of this thesis was to investigate whether non-genotoxic activation of the p53 pathway can eradicate malignant PEL cells. This thesis demonstrates that Nutlin-3, a small-molecule inhibitor of the p53-MDM2 interaction, efficiently restored p53 function in PEL cells, leading to cell cycle arrest and massive apoptosis. Furthermore, we found that KSHV infection activated DNA damage signaling, rendering the cells more sensitive to p53-dependent cell death. We also showed in vivo the therapeutic potential of p53 restoration, which led to regression of subcutaneous and intraperitoneal PEL tumor xenografts without adversely affecting normal cells. Importantly, we demonstrated that in a small subset of intraperitoneal PEL tumors, spontaneous induction of viral reactivation dramatically impaired Nutlin-3-induced p53-mediated apoptosis. Accordingly, we found that elevated KSHV lytic transcripts correlated with PEL tumor burden in animals and that inhibition of viral reactivation in vitro restored the cytotoxic activity of a small-molecule inhibitor of the p53-MDM2 interaction. Latency provides a unique opportunity for KSHV to escape host immune surveillance and to establish persistent infections.
However, to maintain viral reservoirs and spread to other hosts, KSHV must be reactivated from latency and enter the lytic growth phase. We showed that phosphorylation of the nucleolar phosphoprotein nucleophosmin (NPM) by viral cyclin-CDK6 is critical for the establishment and maintenance of KSHV latency. In short, this study provides evidence that the switch between the latent phase and lytic replication is a critical step that determines the outcome of viral infection and the pathogenesis of KSHV-induced malignancies. Our data may thus contribute to the development of novel targeted therapies for the intervention and treatment of KSHV-associated cancers.