986 results for fair value


Relevance:

20.00%

Publisher:

Abstract:

The sandstorms of 2001 were numerically simulated with NARCM, a dust emission and transport model developed by the Meteorological Service of Canada. In this paper, the NARCM dataset is processed and analyzed; the results give a clear picture of the influence ranges and transport routes of the 2001 sandstorms. The outcomes are compared with atmospheric aerosol concentrations over Beijing, China, and Tango, Japan, which confirms that sandstorms occur when the Al, Ti, K, and Si concentrations in the air increase. It can be concluded that the NARCM model is appropriate for modeling sandstorms in northern China. The processing and analysis show that the dust is produced and transported in the Otindag and Bashang areas, so Otindag and Bashang are among the source areas of the sandstorms in East Asia. Another focus of this study is the rare earth element (REE) composition of aeolian sediments in Otindag, Bashang, Tianmo, Badain Jaran, Hulunbeier, and the Kalahari in South Africa. The REE analysis shows: (1) the degree of fractionation between HREE and LREE (HLFD) differs clearly between the deserts, being very high in Hulunbeier, with a (La/Lu)N value of 16.0, against 12.7 in Tianmo and 8.1 in Otindag; (2) the HREE fractionation degree (HFD) is about 4.0 and quite similar in all samples; (3) the LREE fractionation degree (LFD) varies slightly, from 1.5 (Badain Jaran) to 2.3 (Tianmo).
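The (La/Lu)N values quoted above are chondrite-normalized ratios. As a minimal sketch of that arithmetic, the Python below uses placeholder chondrite abundances and a hypothetical sediment composition; neither is taken from the study.

```python
# Chondrite-normalized REE ratio (La/Lu)_N, as quoted in the abstract.
# Reference abundances below are illustrative placeholders (ppm), not the
# normalization values used in the study.
CHONDRITE_PPM = {"La": 0.237, "Lu": 0.0246}

def normalized_ratio(sample_ppm, light, heavy, reference=CHONDRITE_PPM):
    """(light/heavy)_N: divide each element by its chondrite abundance,
    then take the ratio; values well above 1 indicate LREE enrichment."""
    return (sample_ppm[light] / reference[light]) / (sample_ppm[heavy] / reference[heavy])

# Hypothetical aeolian sediment composition (ppm), for illustration only.
sediment = {"La": 30.0, "Lu": 0.19}
print(round(normalized_ratio(sediment, "La", "Lu"), 1))  # an HLFD-style value
```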

Relevance:

20.00%

Publisher:

Abstract:

In non-Western societies, research on social development and personality change has focused on economic development and social modernization. The present study explores the relationship between social transformation and personality change among Chinese people using an indigenous personality measure, the CPAI (Chinese Personality Assessment Inventory). The influence of the CPAI instrument itself and of the underlying measurement theory was also taken into consideration. In Study 1, two sets of CPAI data collected ten years apart were analyzed, and the CPAI-2 data were further analyzed in terms of the modernization level of the cities in which they were collected; this study, however, did not consider the measurement "equivalence" of the CPAI. In Study 2, we tested for DIF (differential item functioning) across the period groups, using both CTT and IRT methods, to confirm whether the CPAI functioned equivalently for people of different periods; the results indicated that some items showed DIF. In Study 3, to ensure that the personality measure was fair to people of different periods, we retained only items whose DIF effect size was below 0.01 and used the IRT method to estimate test-takers' personality; cohort analysis was then used to explore the pattern of personality change among Chinese people. In Study 4, we factor-analyzed the DIF items to relate social transformation to the latent personality variables composed of DIF items. From these four studies we draw the following conclusions. (1) The 22 CPAI traits can be divided into two categories. Type I traits did not change with age, period, or cohort: Logical vs. Affective Orientation, Enterprise, Responsibility, Inferiority vs. Self-Acceptance, Optimism vs. Pessimism, Face, Family, Defensiveness, and Graciousness vs. Meanness. Type II traits did change with age, period, and cohort: Leadership, Self vs. Social Orientation, Veraciousness vs. Slickness, Traditionalism vs. Modernity, Harmony, Renqing, Meticulousness, Extraversion vs. Introversion, Emotionality, Practical Mindedness, Internal vs. External Locus of Control, Thrift vs. Extravagance, and Discipline. In addition, the DIF items measured five psychological characteristics that changed greatly with age, period, and cohort: a cynicism-realism life attitude, psychological maladjustment, the Waiyuanneifang coping style, self-efficacy, and the value of individualism. (2) Overall, Chinese people in 1992 were more traditional than those in 2001; over ten years of rapid, market-driven development, Chinese people became more individualistic. (3) The CTT and IRT approaches to DIF were comparable, but the IRT method was in general more accurate and valid, both in detecting DIF and in estimating personality. (4) The DIF outcomes showed that the CPAI has good item validity, and that subscales can be developed from CPAI items to assess particular psychological characteristics. In the current study, personality traits and psychological characteristics were divided into three categories according to their stability and variability; this outcome supports the hypothesis of the "Six Factor Model", and these findings carry theoretical significance. Moreover, as the relation between social development and personality change is explored, the results can help Chinese people cope with a rapidly changing society.
In this study we also found that it is possible to develop a subscale from CPAI items to assess particular personality traits, which has practical uses. Furthermore, the use of different measurement theories together with cohort analysis embodies some methodological innovation.
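As an illustration of the item-screening step in Study 3, the sketch below filters items by a simple CTT-style DIF effect size (the absolute difference in endorsement rates between period groups) and keeps those below the 0.01 cutoff. The effect-size index and data layout are assumptions here, not the study's exact procedure.

```python
import numpy as np

def dif_effect_sizes(group_a, group_b):
    """Per-item absolute difference in endorsement rates between two
    period groups: a crude CTT-style DIF effect size."""
    return np.abs(group_a.mean(axis=0) - group_b.mean(axis=0))

def keep_dif_free_items(group_a, group_b, threshold=0.01):
    """Indices of items whose DIF effect size is below the threshold,
    mirroring the 0.01 cutoff described in Study 3."""
    return np.where(dif_effect_sizes(group_a, group_b) < threshold)[0]

# Toy 0/1 response matrices (respondents x items), for illustration only.
rng = np.random.default_rng(0)
resp_1992 = rng.integers(0, 2, size=(500, 100))
resp_2001 = rng.integers(0, 2, size=(500, 100))
print(keep_dif_free_items(resp_1992, resp_2001))
```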

Relevance:

20.00%

Publisher:

Abstract:

This year, as the finale to the Artificial Intelligence Laboratory's annual Winter Olympics, the Lab staged an AI Fair, a night devoted to displaying the wide variety of talents and interests within the laboratory. The Fair provided an outlet for creativity and fun in a carnival-like atmosphere. Students organized events from robot boat races to face-recognition vision contests. Research groups came together to make posters and booths explaining their work. The robots rolled down out of the labs, networks were turned over to aerial combat computer games and walls were decorated with posters of zany ideas for the future. Everyone pitched in, and this photograph album is a pictorial account of the fun that night at the AI Fair.

Relevance:

20.00%

Publisher:

Abstract:

Objective: Thirteen urinary nucleosides, primarily degradation products of tRNA, were evaluated as potential tumor markers for breast cancer patients.

Relevance:

20.00%

Publisher:

Abstract:

Traditionally, language speakers are categorised as mono-lingual, bilingual, or multilingual. It is traditionally assumed in English language education that the ‘lingual’ is something that can be ‘fixed’ in form, written down to be learnt and taught. Accordingly, the ‘mono’-lingual will have a ‘fixed’ linguistic form. Such a ‘form’ differs according to a number of criteria or influences, including region or ‘type’ of English (for example, World Englishes), but is nevertheless assumed to be a ‘form’. ‘Mono-lingualism’ is traditionally defined and believed to be ‘speaking one language’, wherever and whatever that language may be. In this chapter, grounded in an individual subjective philosophy of language, we question this traditional definition. Viewing language from philosophical perspectives such as those of Bakhtin and Voloshinov, we argue that the prominence of ‘context’ and ‘consciousness’ in language means that to ‘fix’ the form of a language goes against the very spirit of how it is formed and used. We thus challenge the categorisation of ‘mono’-lingualism, proposing that such a categorisation is actually a category error, a case ‘in which a property is ascribed to a thing that could not possibly have that property’ (Restivo, 2013, p. 175), in this case the property of ‘mono’. Using this proposition as a starting point, we suggest that more time be devoted to language in its context and to its genuine use as a vehicle for consciousness. We theorise that this can be done through a ‘literacy’-based approach which fronts the context of language use rather than the language itself. We outline how we envision this working for teachers, students, and developers of English Language Education materials in a global setting. To do this we consider Scotland’s Curriculum for Excellence as an exemplar to promote conscious language use in context.

Relevance:

20.00%

Publisher:

Abstract:

This is draft 2 of a discussion paper written for Boston University Libraries.

Relevance:

20.00%

Publisher:

Abstract:

A problem with Speculative Concurrency Control (SCC) algorithms, and with other common concurrency control schemes that use forward validation, is that committing a transaction as soon as it finishes validating may result in a loss of value to the system. Haritsa showed that by making a lower-priority transaction wait after it is validated, the number of transactions meeting their deadlines increases, which may result in higher value added to the system. SCC-based protocols can benefit from the introduction of such delays by giving optimistic shadows with high value-added more time to execute and commit, instead of being aborted in favor of other validating transactions whose value-added is lower. In this paper we present and evaluate an extension to SCC algorithms that allows for commit deferments.
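A minimal sketch of the deferment decision just described, under assumed data structures: a validated transaction is held back when a conflicting optimistic shadow with higher value-added could still commit within both deadlines. This illustrates the idea, not the paper's exact protocol.

```python
from dataclasses import dataclass

@dataclass
class Txn:
    name: str
    value_added: float   # value the system gains if this transaction commits
    finish_time: float   # estimated completion time (for executing shadows)
    deadline: float

def should_defer_commit(validated, executing_shadows):
    """Defer the validated transaction's commit if some higher-value shadow
    can still finish before its own deadline and before the validated
    transaction's deadline, so the deferment costs nothing."""
    return any(
        s.value_added > validated.value_added
        and s.finish_time <= s.deadline
        and s.finish_time <= validated.deadline
        for s in executing_shadows
    )
```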

Relevance:

20.00%

Publisher:

Abstract:

Attributing a dollar value to a keyword is an essential part of running any profitable search engine advertising campaign. When an advertiser has complete control over the interaction with and monetization of each user arriving on a given keyword, the value of that term can be accurately tracked. However, in many instances, the advertiser may monetize arrivals indirectly through one or more third parties. In such cases, it is typical for the third party to provide only coarse-grained reporting: rather than report each monetization event, users are aggregated into larger channels and the third party reports aggregate information such as total daily revenue for each channel. Examples of third parties that use channels include Amazon and Google AdSense. In such scenarios, the number of channels is generally much smaller than the number of keywords whose value per click (VPC) we wish to learn. However, the advertiser has flexibility as to how to assign keywords to channels over time. We introduce the channelization problem: how do we adaptively assign keywords to channels over the course of multiple days to quickly obtain accurate VPC estimates of all keywords? We relate this problem to classical results in weighing design, devise new adaptive algorithms for this problem, and quantify the performance of these algorithms experimentally. Our results demonstrate that adaptive weighing designs that exploit statistics of term frequency, variability in VPCs across keywords, and flexible channel assignments over time provide the best estimators of keyword VPCs.
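The estimation task can be phrased as a linear system, which is where the connection to weighing designs comes from: each (day, channel) pair contributes one equation whose coefficients are the clicks of the keywords assigned to that channel that day. Below is a minimal sketch with random assignments and synthetic data; the paper's adaptive designs would choose assignments more carefully.

```python
import numpy as np

rng = np.random.default_rng(1)
n_keywords, n_channels, n_days = 8, 2, 12
true_vpc = rng.uniform(0.1, 2.0, n_keywords)          # unknown value per click

rows, revenues = [], []
for _ in range(n_days):
    assign = rng.integers(0, n_channels, n_keywords)   # keyword -> channel today
    clicks = rng.poisson(50, n_keywords).astype(float)
    for c in range(n_channels):
        coeff = np.where(assign == c, clicks, 0.0)     # clicks landing in channel c
        rows.append(coeff)
        revenues.append(coeff @ true_vpc)              # aggregate revenue reported

# Solve the stacked system for per-keyword VPC estimates.
est_vpc, *_ = np.linalg.lstsq(np.array(rows), np.array(revenues), rcond=None)
print(np.max(np.abs(est_vpc - true_vpc)))              # ~0 with noiseless reports
```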

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present Statistical Rate Monotonic Scheduling (SRMS), a generalization of the classical RMS results of Liu and Layland that allows scheduling periodic tasks with highly variable execution times and statistical QoS requirements. Like RMS, SRMS has two components: a feasibility test and a scheduling algorithm. The feasibility test for SRMS ensures that, using the SRMS scheduling algorithm, a given periodic task set can share a given resource (e.g., a processor, communication medium, or switching device) without violating any of the periodic tasks' QoS constraints. The SRMS scheduling algorithm incorporates a number of unique features. First, it allows for fixed-priority scheduling that keeps the tasks' value (or importance) independent of their periods. Second, it allows for job admission control, under which jobs that are not guaranteed to finish by their deadlines are rejected as soon as they are released, enabling the system to take the necessary compensating actions; admission control also preserves resources, since no time is spent on jobs that would miss their deadlines anyway. Third, SRMS integrates reservation-based and best-effort resource scheduling seamlessly: reservation-based scheduling ensures the delivery of the minimal requested QoS, while best-effort scheduling ensures that unused reserved bandwidth is not wasted but used to improve QoS further. Fourth, SRMS allows a system to deal gracefully with overload conditions by ensuring a fair deterioration in QoS across all tasks, as opposed to penalizing tasks with longer periods, for example. Finally, SRMS has the added advantage that its schedulability test is simple and its scheduling algorithm has constant overhead, in the sense that the complexity of the scheduler does not depend on the number of tasks in the system. We have evaluated SRMS against a number of alternative scheduling algorithms suggested in the literature (e.g., RMS and slack stealing), as well as refinements thereof, which we describe in this paper. Consistently throughout our experiments, SRMS provided the best performance. In addition, to evaluate the optimality of SRMS, we have compared it to an inefficient yet optimal scheduler for task sets with harmonic periods.
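A minimal sketch of the admission-control idea: each task holds a reserved per-period budget, and a released job is rejected immediately unless its execution demand fits the remaining budget, so no processor time is wasted on jobs that would miss their deadlines. Budget replenishment and the actual SRMS feasibility test are simplified assumptions here.

```python
from dataclasses import dataclass

@dataclass
class PeriodicTask:
    period: float
    budget: float      # reserved execution allowance per period
    remaining: float   # allowance left in the current period

def admit_job(task, exec_time):
    """Admit a released job only if it fits the remaining budget; otherwise
    reject it at release so the system can take compensating action."""
    if exec_time <= task.remaining:
        task.remaining -= exec_time
        return True
    return False

def on_period_boundary(task):
    """Replenish the reserved allowance at the start of each period."""
    task.remaining = task.budget

t = PeriodicTask(period=10.0, budget=3.0, remaining=3.0)
print(admit_job(t, 2.0), admit_job(t, 2.0))  # True, then False (budget exhausted)
```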

Relevance:

20.00%

Publisher:

Abstract:

The majority of the traffic (bytes) flowing over the Internet today has been attributed to the Transmission Control Protocol (TCP). This strong presence of TCP has recently spurred further investigations into its congestion avoidance mechanism and its effect on the performance of short and long data transfers. At the same time, the rising interest in enhancing Internet services while keeping the implementation cost low has led to several service-differentiation proposals. In such service-differentiation architectures, much of the complexity is placed only in access routers, which classify and mark packets from different flows. Core routers can then allocate enough resources to each class of packets so as to satisfy delivery requirements, such as predictable (consistent) and fair service. In this paper, we investigate the interaction among short and long TCP flows, and how TCP service can be improved by employing a low-cost service-differentiation scheme. Through control-theoretic arguments and extensive simulations, we show the utility of isolating TCP flows into two classes based on their lifetime/size, namely one class of short flows and another of long flows. With such class-based isolation, short and long TCP flows have separate service queues at routers. This protects each class of flows from the other, as they possess different characteristics, such as burstiness of arrivals/departures and congestion/sending window dynamics. We show the benefits of isolation, in terms of better predictability and fairness, over traditional shared queueing systems with both tail-drop and Random Early Drop (RED) packet dropping policies. The proposed class-based isolation of TCP flows has several advantages: (1) the implementation cost is low, since it only requires core routers to maintain per-class (rather than per-flow) state; (2) it promises to be an effective traffic engineering tool for improved predictability and fairness for both short and long TCP flows; and (3) stringent delay requirements of short interactive transfers can be met by increasing the amount of resources allocated to the class of short flows.
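A minimal sketch of class-based isolation at a router, under assumed parameters: flows are classified by cumulative size into a short-flow or long-flow queue, and the short-flow queue is served first to meet the stringent delay requirements of short transfers. The byte threshold and the service discipline are illustrative choices, not the paper's exact scheme.

```python
from collections import deque

SHORT_FLOW_BYTES = 100_000   # classification threshold (assumed)
short_q, long_q = deque(), deque()

def enqueue(packet, flow_bytes_so_far):
    """Classify by the flow's cumulative size; each class gets its own
    queue, protecting one class from the other's burstiness."""
    if flow_bytes_so_far <= SHORT_FLOW_BYTES:
        short_q.append(packet)
    else:
        long_q.append(packet)

def dequeue():
    """Serve short flows first (their delay requirements are stringent);
    long flows receive the residual service."""
    if short_q:
        return short_q.popleft()
    if long_q:
        return long_q.popleft()
    return None
```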

Relevance:

20.00%

Publisher:

Abstract:

We present a procedure to infer a typing for an arbitrary λ-term M in an intersection-type system that translates into exactly the call-by-name (resp., call-by-value) evaluation of M. Our framework is the recently developed System E which augments intersection types with expansion variables. The inferred typing for M is obtained by setting up a unification problem involving both type variables and expansion variables, which we solve with a confluent rewrite system. The inference procedure is compositional in the sense that typings for different program components can be inferred in any order, and without knowledge of the definition of other program components. Using expansion variables lets us achieve a compositional inference procedure easily. Termination of the procedure is generally undecidable. The procedure terminates and returns a typing if the input M is normalizing according to call-by-name (resp., call-by-value). The inferred typing is exact in the sense that the exact call-by-name (resp., call-by-value) behaviour of M can be obtained by a (polynomial) transformation of the typing. The inferred typing is also principal in the sense that any other typing that translates the call-by-name (resp., call-by-value) evaluation of M can be obtained from the inferred typing for M using a substitution-based transformation.

Relevance:

20.00%

Publisher:

Abstract:

Animals are motivated to choose environmental options that can best satisfy current needs. To explain such choices, this paper introduces the MOTIVATOR (Matching Objects To Internal Values Triggers Option Revaluations) neural model. MOTIVATOR describes cognitive-emotional interactions between higher-order sensory cortices and an evaluative neuraxis composed of the hypothalamus, amygdala, and orbitofrontal cortex. Given a conditioned stimulus (CS), the model amygdala and lateral hypothalamus interact to calculate the expected current value of the subjective outcome that the CS predicts, constrained by the current state of deprivation or satiation. The amygdala relays the expected value information to orbitofrontal cells that receive inputs from anterior inferotemporal cells, and medial orbitofrontal cells that receive inputs from rhinal cortex. The activations of these orbitofrontal cells code the subjective values of objects. These values guide behavioral choices. The model basal ganglia detect errors in CS-specific predictions of the value and timing of rewards. Excitatory inputs from the pedunculopontine nucleus interact with timed inhibitory inputs from model striosomes in the ventral striatum to regulate dopamine burst and dip responses from cells in the substantia nigra pars compacta and ventral tegmental area. Learning in cortical and striatal regions is strongly modulated by dopamine. The model is used to address tasks that examine food-specific satiety, Pavlovian conditioning, reinforcer devaluation, and simultaneous visual discrimination. Model simulations successfully reproduce discharge dynamics of known cell types, including signals that predict saccadic reaction times and CS-dependent changes in systolic blood pressure.
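As a schematic of the value computation at the heart of the model: the value of a CS is its learned association with each homeostatic outcome, weighted by the current drive (deprivation) state, so a food-predicting CS loses value after feeding to satiety. This is a drastic simplification of the amygdala-hypothalamic circuit, for orientation only; all numbers are hypothetical.

```python
import numpy as np

def cs_value(learned_outcome, drive_state):
    """Deprivation-weighted expected value of a conditioned stimulus:
    a dot product of what the CS predicts with how much each outcome
    is currently needed."""
    return float(np.dot(learned_outcome, drive_state))

food_cs = np.array([1.0, 0.0])   # predicts food, not water
hungry  = np.array([0.9, 0.2])   # strong food deprivation
sated   = np.array([0.1, 0.2])   # after feeding to satiety

print(cs_value(food_cs, hungry))  # high value: the CS is chosen
print(cs_value(food_cs, sated))   # devalued after satiation
```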

Relevance:

20.00%

Publisher:

Abstract:

The pervasive use of mobile technologies has provided new opportunities for organisations to achieve competitive advantage by using a value network of partners to create value for multiple users. The delivery of a mobile payment (m-payment) system is an example of a value network, as it requires the collaboration of multiple partners from diverse industries, each bringing their own expertise, motivations and expectations. Consequently, managing partnerships has been identified as a core competence required by organisations to form viable partnerships in an m-payment value network, and as an important factor in determining the sustainability of an m-payment business model. However, there is evidence that organisations lack this competence, a deficiency witnessed in the m-payment domain, where it has been cited as a contributing factor in a number of failed m-payment initiatives since 2000. In response to this organisational deficiency, this research project leverages design thinking and visualisation tools to enhance communication and understanding between managers who are responsible for managing partnerships within the m-payment domain. By adopting a design science research approach, which is a problem-solving paradigm, the research builds and evaluates a visualisation tool in the form of a Partnership Management Canvas. In doing so, this study demonstrates that when organisations encourage their managers to adopt design thinking, as a way to balance their analytical thinking and intuitive thinking, communication and understanding between the partners increase. This can lead to a shared understanding and a shared commitment between the partners. In addition, the research identifies a number of key business model design issues that need to be considered by researchers and practitioners when designing an m-payment business model. As an applied research project, the study makes valuable contributions to the knowledge base and to the practice of management.

Relevance:

20.00%

Publisher:

Abstract:

This work illustrates the influence of wind forecast errors on system costs, wind curtailment and generator dispatch in a system with high wind penetration. Realistic wind forecasts of different specified accuracy levels are created using an auto-regressive moving average model and these are then used in the creation of day-ahead unit commitment schedules. The schedules are generated for a model of the 2020 Irish electricity system with 33% wind penetration using both stochastic and deterministic approaches. Improvements in wind forecast accuracy are demonstrated to deliver: (i) clear savings in total system costs for deterministic and, to a lesser extent, stochastic scheduling; (ii) a decrease in the level of wind curtailment, with close agreement between stochastic and deterministic scheduling; and (iii) a decrease in the dispatch of open cycle gas turbine generation, evident with deterministic, and to a lesser extent, with stochastic scheduling.
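A minimal sketch of the forecast-generation step: an ARMA(1,1) error series is scaled to the desired accuracy level and added to the actual wind trace to produce a synthetic day-ahead forecast. The coefficients, error scaling, and toy wind trace are assumptions for illustration; the study fits its own model.

```python
import numpy as np

def arma11_errors(n, phi=0.9, theta=0.3, sigma=1.0, seed=0):
    """ARMA(1,1) series: e[t] = phi*e[t-1] + eps[t] + theta*eps[t-1],
    giving the temporal correlation typical of wind forecast errors."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + eps[t] + theta * eps[t - 1]
    return e

# Toy half-hourly wind trace (MW) and a synthetic forecast at one accuracy level.
actual = 1500 + 400 * np.sin(np.linspace(0, 4 * np.pi, 48))
forecast = np.clip(actual + 60 * arma11_errors(48), 0.0, None)  # scale sets accuracy
print(np.round(np.abs(forecast - actual).mean(), 1))            # mean absolute error (MW)
```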

Relevance:

20.00%

Publisher:

Abstract:

Creativity is often defined as developing something novel that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantage through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of “filters”, or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, which is one of the world’s top coffee exporters and has created a coffee culture rivaling that of many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.