907 results for paraconsistent model theory


Relevance: 40.00%

Publisher:

Abstract:

BACKGROUND Achieving compliance with surgical checklist use remains an obstacle for checklist implementation programs. The theory of planned behaviour was applied to analyse attitudes, perceived behavioural control, and norms as psychological antecedents of individuals' intentions to use the checklist. METHODS A cross-sectional survey study with staff (N = 866) of 10 Swiss hospitals was conducted in German and French. Group mean differences between individuals with and without managerial function were computed. Structural equation modelling and confirmatory factor analysis were applied to investigate the structural relations between attitudes, perceived behavioural control, norms, and intentions. RESULTS Significant mean differences in favour of individuals with managerial function emerged for norms, perceived behavioural control, and intentions, but not for attitudes. Attitudes and perceived behavioural control had a significant direct effect on intentions, whereas norms did not. CONCLUSIONS Individuals with managerial function exhibit stronger perceived behavioural control, stronger norms, and stronger intentions. This could be exploited to facilitate checklist implementation. The structural model of the theory of planned behaviour remains stable across groups, indicating that it is a valid model for describing the antecedents of intentions in the context of surgical checklist implementation.

Relevance: 40.00%

Publisher:

Abstract:

We propose a way to incorporate non-tariff barriers (NTBs) into the four workhorse models of the modern trade literature within computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton-Kortum model can each be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles; in the Melitz model they are also a function of the price of the factor input bundles. Lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
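As a hedged illustration of the common structure claimed above (the notation is ours, not necessarily the paper's), all four models can be cast as a CES/Armington expenditure-share equation in which only the interpretation of the cost and externality terms changes across models:

$$\pi_{sd} \;=\; \frac{\mu_{sd}\,\bigl(\tilde{c}_s\,\tilde{\tau}_{sd}\bigr)^{1-\sigma}}{\sum_k \mu_{kd}\,\bigl(\tilde{c}_k\,\tilde{\tau}_{kd}\bigr)^{1-\sigma}},$$

where $\pi_{sd}$ is destination $d$'s expenditure share on goods from source $s$, $\tilde{c}_s$ is the generalized marginal cost (a function of the quantity of factor input bundles in the Ethier-Krugman and Melitz models, and also of their price in the Melitz model), $\tilde{\tau}_{sd}$ is the generalized trade cost, and $\mu_{sd}$ is the demand-externality term (absent in the plain Armington case).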

Relevance: 40.00%

Publisher:

Abstract:

The authors, all from UPM, have worked as a loosely connected group and have each been involved, at different times and career stages, in academic or real-world cases on the subject. Building on the precedent of E. Torroja and A. Páez in Madrid, Spain (probabilistic safety models for concrete, circa 1957, now discussed in ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for the reduction of superposed independent loads and, using it, a pattern of load coefficients for codes presented in Rome in February 1969, which was largely adopted for European construction; at JCSS in Lisbon in February 1974 he suggested its unification for concrete, steel and aluminium. That model represents each type of load with a Gumbel Type I distribution over a 50-year reference period, reduces it to 1 year so that it can be added to other independent loads, and then sets the sum, within Gumbel theory, back to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current structural Eurocodes derived from the Model Codes. The author examined this system within CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage standard for MOPU (Spain), the authors developed an optimization model that provides a way to determine the return period, from 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in southeastern Spain modelled with a Gumbel Type I distribution and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II distribution; the model can be modernized with a wider range of extreme-value laws. In fact, in the MOPU drainage standard the drafting commission also acted as an expert panel to set a table of return periods for road drainage elements, in effect a complex multi-criteria decision system. These earlier ideas were used, for example, in widely applied codes and were presented at symposia or meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and offer modest suggestions for intended applications in agricultural and environmental planning, namely the selection of the criteria and utility functions used in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to climate change, to changes in production and commercial systems, and to other social and financial factors.
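As a hedged illustration of the return-period arithmetic described above (not the authors' actual code), the sketch below converts a Gumbel Type I distribution fitted to annual maxima into the distribution of the 50-year maximum, using the standard fact that the maximum of T independent annual maxima has CDF F(x)^T; the location and scale values are invented for the example.

```python
import math

# Assumed (hypothetical) Gumbel Type I parameters fitted to ANNUAL maxima of one load/flow:
# F_1(x) = exp(-exp(-(x - u) / beta))
u, beta = 100.0, 15.0   # location and scale, arbitrary units

def gumbel_cdf(x, u, beta):
    """CDF of a Gumbel Type I (maximum) distribution."""
    return math.exp(-math.exp(-(x - u) / beta))

def return_level(T, u, beta):
    """Value exceeded on average once every T years by the annual maximum."""
    p = 1.0 - 1.0 / T                      # non-exceedance probability in one year
    return u - beta * math.log(-math.log(p))

# If annual maxima are independent, the 50-year maximum has CDF F_1(x)**50,
# which is again Gumbel with the same scale and a location shifted by beta*ln(50).
T = 50
u_T = u + beta * math.log(T)

x10 = return_level(10, u, beta)
x50 = return_level(50, u, beta)
print(f"10-year return level: {x10:.1f}")
print(f"50-year return level: {x50:.1f}")
print(f"P(50-year maximum <= 50-year return level) = {gumbel_cdf(x50, u_T, beta):.3f}")
```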

Relevance: 40.00%

Publisher:

Abstract:

A molecular model of poorly understood hydrophobic effects is heuristically developed using the methods of information theory. Because primitive hydrophobic effects can be tied to the probability of observing a molecular-sized cavity in the solvent, the probability distribution of the number of solvent centers in a cavity volume is modeled on the basis of the two moments available from the density and radial distribution of oxygen atoms in liquid water. The modeled distribution then yields the probability that no solvent centers are found in the cavity volume. This model is shown to account quantitatively for the central hydrophobic phenomena of cavity formation and association of inert gas solutes. The connection of information theory to statistical thermodynamics provides a basis for clarification of hydrophobic effects. The simplicity and flexibility of the approach suggest that it should permit applications to conformational equilibria of nonpolar solutes and hydrophobic residues in biopolymers.
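A hedged sketch of the two-moment construction described above, written in our own notation using standard maximum-entropy relations: the distribution of the number $n$ of solvent centers in the cavity volume $v$ is taken as the maximum-entropy form consistent with its first two moments,

$$p_n \propto \exp\!\left(\lambda_1 n + \lambda_2 n^2\right),\qquad \sum_n p_n = 1,\quad \sum_n n\,p_n = \langle n\rangle,\quad \sum_n n^2\,p_n = \langle n^2\rangle,$$

with the moments supplied by the bulk density $\rho$ and the oxygen-oxygen radial distribution function $g(r)$,

$$\langle n\rangle = \rho v,\qquad \langle n^2\rangle - \langle n\rangle^2 = \rho v + \rho^2 \int_v\!\int_v \bigl[g(|\mathbf{r}-\mathbf{r}'|) - 1\bigr]\, d\mathbf{r}\, d\mathbf{r}',$$

and the free energy of forming an empty cavity (the hydration free energy of a hard-core solute) recovered from the probability of observing no solvent centers,

$$\Delta\mu^{\mathrm{ex}} = -k_B T \ln p_0.$$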

Relevance: 40.00%

Publisher:

Abstract:

Poor hygienic practices and illness of restaurant employees are major contributors to the contamination of food and the occurrence of food-borne illness in the United States, costing the food industry and society billions of dollars each year. Risk factors associated with this problem include lack of proper handwashing; food handlers reporting to work sick; poor personal hygiene; and bare hand contact with ready-to-eat foods. However, traditional efforts to control these causes of food-borne illness by public health authorities have had limited impact, and have revealed the need for comprehensive and innovative programs that provide active managerial control over employee health and hygiene in restaurant establishments. Further, the introduction and eventual adoption by the food industry of such programs can be facilitated through the use of behavior-change theory. This Capstone Project develops a model program to assist restaurant owners and operators in exerting active control over health and hygiene in their establishments and provides theory-based recommendations for the introduction of the program to the food industry.

Relevance: 40.00%

Publisher:

Abstract:

The economic growth of a country depends on the funds available, both to finance capital formation and to replace it. Firms and public bodies obtain these funds in several ways, notably through the issuance of securities. Savers, the holders of resources, by buying securities combine high profitability with high liquidity in the remuneration of their investments. Stock exchanges are where firms, public bodies and savers can see these interests reconciled in an efficient, effective and transparent manner, thereby guaranteeing greater liquidity to the financial securities traded on the exchange. Firms have several financing alternatives; managers regard the capital market as the source where the income or return obtained can be highest, against the counterpart of incurring greater risk. This market arises as an alternative to bank loans: firms can thereby obtain financing from third parties, who become shareholders of the firm. They can issue new shares on the stock market to attract external investors who guarantee the sustainability of the business. Shares come in various forms and classes and cause the firm's capital to be shared among all its shareholders, in proportion to the stakes acquired individually. This dissertation investigates the dynamics of buying and selling shares on the stock market, the factors that determine their price, and the models that allow shares to be valued and the rate of return expected by an investor to be inferred. Share valuation is the central topic of this analysis, more precisely the determination and forecasting of the price, rather than the price itself, since the price is published daily in several newspapers and on the internet. By studying how the price of a share is determined over a time horizon, an investor can infer whether his shares are being valued at a fair price and, more importantly, can refine his forecast using data and factor analysis. Another important point addressed in this research concerns the possibility for firms to know how the market values them, in order to make sound capital budgeting decisions. The attractiveness of a business can only be judged if one knows how shares are valued. In financial markets, economic agents tend to relate the price to the value of financial securities, and decisions to trade securities are made by comparing the two. The price or market quotation is formed in organized markets and therefore depends on the market's operating rules, such as minimum trade sizes or the maximum and minimum permitted variation; prices are associated with a transaction, thus depending on the supply of and demand for the securities, and they also incorporate transaction costs. The idea underlying the Capital Asset Pricing Model (CAPM) is that investors expect a reward for bearing investments with risk, that is, with an uncertain return. When investing in risky securities, one expects an extra return compared with risk-free Treasury bills (which pay only interest), in other words a risk premium for bearing that risk.
The uncertainty in the return on securities comes from two sources: macroeconomic factors, which can also be called a common factor, and factors specific to the firm's activity. The common factor is assumed to have an expected value of zero, since it measures new information concerning the macroeconomy. The model rests on two fundamental ideas: first, there is consensus that investors demand an additional return for bearing risk; second, investors are mainly concerned with overall market risk, which cannot be eliminated by diversifying the investment portfolio. This model can be quite effective because it considers only a single factor in calculating the expected return of a financial security, namely the volatility of the market as a whole, which can be studied, unlike multifactor models, which include several macroeconomic factors and make the purpose of the analysis less intuitive and more complex. Several models exist for valuing the securities of a listed company; these models generally use risk-free interest rates to balance diversified portfolios, although it is necessary to analyse the return of a security or portfolio under the influence of several macroeconomic variables. Another model applied in this dissertation is the Arbitrage Pricing Theory (APT); by comparing its results with those of the first model, it is possible to determine which of the two has the more conclusive application to the Portuguese stock market. This arbitrage valuation model establishes a linear relation between the excess of the expected return on assets over the certain (risk-free) interest rate and a set of variables. It assumes that the rate of return of a risky asset is a linear function of a set of factors common to all financial assets. Its underlying idea is the construction of a no-arbitrage portfolio, that is, a portfolio that involves no risk (systematic or specific) and requires no initial investment, because the sale of certain assets generates the funds to acquire new ones. The methodology implemented covers the financial market and the models applicable to this question. To address the research hypotheses, the CAPM and the APT were applied using official data collected from financial institutions and from the NYSE Euronext Lisbon stock exchange. A recent time period was considered so that concrete conclusions and precise indicators could be obtained for future implementation and development. The main conclusion of this dissertation is that the application of the models is not fully validated; nevertheless, the CAPM is more conclusive than the APT, since it can be said to have practical application to firms whose market capitalization and annual beta are known a priori. For example, the model may be applied to the securities of firms with market capitalizations below five billion euros and an annual beta below 1, as well as to securities of firms with market capitalizations above ten billion euros and an annual beta above 1.5. The banking and paper sectors may also have potential for application of the model.
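A minimal sketch (not taken from the dissertation) of the CAPM step described above: estimating a stock's beta by regressing its excess returns on the market's excess returns and then computing the CAPM expected return. All numbers are illustrative, simulated data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative monthly returns (assumption: 60 months of simulated data).
market = rng.normal(0.006, 0.04, 60)                      # market portfolio returns
stock = 0.002 + 1.2 * market + rng.normal(0, 0.02, 60)    # a stock with "true" beta ~ 1.2
rf = 0.001                                                # risk-free rate per month

# CAPM: E[r_i] - r_f = beta_i * (E[r_m] - r_f).
# Estimate beta by ordinary least squares on excess returns.
x = market - rf
y = stock - rf
beta = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
alpha = y.mean() - beta * x.mean()

expected_return = rf + beta * (market.mean() - rf)
print(f"estimated beta  = {beta:.2f}")
print(f"estimated alpha = {alpha:.4f}")
print(f"CAPM expected monthly return = {expected_return:.4f}")
```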

Relevance: 40.00%

Publisher:

Abstract:

Mineral processing plants use two main processes: comminution and separation. The objective of the comminution process is to break complex particles consisting of numerous minerals into smaller, simpler particles in which individual particles consist primarily of only one mineral. The process in which the mineral composition distribution in particles changes due to breakage is called 'liberation'. The purpose of separation is to separate particles consisting of valuable mineral from those containing non-valuable mineral. The energy required to break particles to fine sizes is expensive, and therefore the mineral processing engineer must design the circuit so that the breakage of liberated particles is reduced in favour of breaking composite particles. In order to effectively optimize a circuit through simulation it is necessary to predict how the mineral composition distributions change due to comminution. Such a model is called a 'liberation model for comminution'. It was generally considered that such a model should incorporate information about the ore, such as its texture. However, the relationship between the feed and product particles can be estimated using a probability method, with the probability defined as the probability that a feed particle of a particular size and composition will form a particular product particle of a particular size and composition. The model is based on maximizing the entropy of this probability subject to mass constraints and a composition constraint. This methodology allows a liberation model to be developed not only for binary particles but also for particles consisting of many minerals. Results from applying the model to real plant ore are presented. A laboratory ball mill was used to break particles, and the results from this experiment were used to estimate the kernel that represents the relationship between parent and progeny particles. A second feed, consisting primarily of heavy particles subsampled from the main ore, was then ground through the same mill. The results from the first experiment were used to predict the product of the second experiment. The agreement between the predicted results and the actual results is very good. It is therefore recommended that more extensive validation be undertaken to fully evaluate the substance of the method. (C) 2003 Elsevier Ltd. All rights reserved.
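A toy sketch of the maximum-entropy idea underlying such a liberation kernel (not the authors' kernel): given discrete product-composition classes, find the distribution of maximum entropy whose mean composition matches a mass/composition constraint. The composition bins and the target mean grade are invented for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# Product particles fall into composition classes (fraction of valuable mineral).
comps = np.linspace(0.0, 1.0, 11)      # 0%, 10%, ..., 100% valuable mineral
target_mean = 0.30                     # composition constraint: parent particle grade (assumed)

# Maximum-entropy distribution subject to sum(p) = 1 and sum(p * comps) = target_mean
# has the exponential form p_i ∝ exp(lam * comps_i); solve for the multiplier lam.
def mean_minus_target(lam):
    w = np.exp(lam * comps)
    p = w / w.sum()
    return p @ comps - target_mean

lam = brentq(mean_minus_target, -50.0, 50.0)
p = np.exp(lam * comps)
p /= p.sum()

print("lambda =", round(lam, 3))
for c, pi in zip(comps, p):
    print(f"composition {c:.1f}: probability {pi:.3f}")
```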

Relevance: 40.00%

Publisher:

Abstract:

In the absence of an external frame of reference (i.e., in background-independent theories such as general relativity), physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate using elementary quantum theory how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet, as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of spin networks, introduced by Penrose as a combinatorial description of geometry and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semiclassical limit to derive the fully relational quantum theory) may offer interesting insights on the low-energy limit of quantum gravity.

Relevance: 40.00%

Publisher:

Abstract:

Recent world events aside, downward trends in donating behaviour in Australia have increased the need for research into the factors that inhibit and encourage charitable giving. A revised Theory of Planned Behaviour (TPB) model was used to determine the influence of attitudes, norms (injunctive, descriptive, and moral norms), perceived behavioural control (PBC), and past behaviour (PB) on intentions to donate money to charities and community service organisations. Respondents (N = 186) completed a questionnaire assessing the constructs of the revised TPB model. Four weeks later, self-reported donating behaviour was assessed (n = 65). Results showed support for the revised TPB model. Attitudes, PBC, injunctive norms, moral norms, and PB all predicted donating intentions; descriptive norms did not. Intention was the only significant predictor of self-reported behaviour four weeks later, with neither PBC nor PB having a direct effect on behaviour. Theoretical and applied implications of the results are discussed.
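A minimal sketch (not the authors' analysis) of the two-step prediction described above, using ordinary least squares as a stand-in for the reported regression models; all data are simulated purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 186  # same sample size as the survey; data are simulated

# Simulated, standardized TPB predictor scores (purely illustrative).
attitude = rng.normal(size=n)
pbc = rng.normal(size=n)
injunctive = rng.normal(size=n)
descriptive = rng.normal(size=n)
moral = rng.normal(size=n)
past_behaviour = rng.normal(size=n)

# Intention generated mainly from attitude, PBC, injunctive/moral norms and past behaviour.
intention = (0.4 * attitude + 0.3 * pbc + 0.2 * injunctive
             + 0.2 * moral + 0.3 * past_behaviour + rng.normal(scale=0.5, size=n))

# Step 1: predictors of donating intention.
X1 = sm.add_constant(np.column_stack(
    [attitude, pbc, injunctive, descriptive, moral, past_behaviour]))
step1 = sm.OLS(intention, X1).fit()
print(step1.summary())

# Step 2: intention (plus PBC and past behaviour) predicting behaviour in the follow-up subsample.
behaviour = 0.5 * intention + rng.normal(scale=0.7, size=n)
X2 = sm.add_constant(np.column_stack([intention, pbc, past_behaviour]))
step2 = sm.OLS(behaviour[:65], X2[:65]).fit()
print(step2.params)
```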

Relevance: 40.00%

Publisher:

Abstract:

Contrast sensitivity improves with the area of a sine-wave grating, but why? Here we assess this phenomenon against contemporary models involving spatial summation, probability summation, uncertainty, and stochastic noise. Using a two-interval forced-choice procedure we measured contrast sensitivity for circular patches of sine-wave gratings with various diameters that were blocked or interleaved across trials to produce low and high extrinsic uncertainty, respectively. Summation curves were steep initially, becoming shallower thereafter. For the smaller stimuli, sensitivity was slightly worse for the interleaved design than for the blocked design. Neither area nor blocking affected the slope of the psychometric function. We derived model predictions for noisy mechanisms and extrinsic uncertainty that was either low or high. The contrast transducer was either linear (c^1.0) or nonlinear (c^2.0), and pooling was either linear or a MAX operation. There was either no intrinsic uncertainty, or it was fixed or proportional to stimulus size. Of these 10 canonical models, only the nonlinear transducer with linear pooling (the noisy energy model) described the main forms of the data for both experimental designs. We also show how a cross-correlator can be modified to fit our results and provide a contemporary presentation of the relation between summation and the slope of the psychometric function.
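A hedged Monte Carlo sketch of the "noisy energy model" idea named above (nonlinear c^2 transducer, linear pooling over the stimulated area, additive Gaussian noise, 2IFC decision); the parameter values are assumptions for illustration, not the authors' fits.

```python
import numpy as np

rng = np.random.default_rng(1)

def percent_correct(contrast, n_units, n_trials=20000, noise_sd=1.0):
    """2IFC simulation of a noisy energy observer.

    Each stimulated unit applies a squaring transducer to contrast,
    unit responses are pooled linearly over the stimulated area, and
    additive Gaussian noise perturbs each interval's pooled response.
    """
    signal = n_units * contrast**2                               # linear pooling of c^2 responses
    r_signal = signal + rng.normal(0, noise_sd * np.sqrt(n_units), n_trials)
    r_blank = rng.normal(0, noise_sd * np.sqrt(n_units), n_trials)
    return np.mean(r_signal > r_blank)                           # pick the interval with the larger response

# Sensitivity grows with grating area (number of stimulated units):
for n_units in (1, 4, 16):
    contrasts = np.linspace(0.05, 3.0, 60)
    pc = np.array([percent_correct(c, n_units) for c in contrasts])
    threshold = contrasts[np.argmin(np.abs(pc - 0.75))]          # crude 75%-correct threshold search
    print(f"{n_units:2d} units: threshold contrast ~ {threshold:.2f}")
```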

Relevance: 40.00%

Publisher:

Abstract:

Entrepreneurial activity is a decisive factor in job creation and economic growth, and its future level depends largely on the attitudes of today's youth. If the most important factors influencing young people's intentions to start a business can be identified, it also becomes possible to decide where intervention is feasible and worthwhile so that as many new, viable enterprises as possible are created. Drawing on the Hungarian database of the GUESSS research project, which contains the answers of almost 6,000 students, this article systematises the most important groups of factors that influence the start-up intentions of students in higher education. It first examines the drivers of start-up intention by applying Ajzen's Theory of Planned Behaviour, and then brings in further factors, such as the services provided by higher education institutions, family background and demographic characteristics, in order to describe the formation of intention as precisely as possible.