921 results for Univalent Functions with Negative Coefficients
Abstract:
Undergraduate psychology students from stepfamilies (always one step and one biological parent) and biologically intact families (always both biological parents) participated in this study. The goal was to assess perceptions of stepfamilies (N = 106; stepfamily n = 44, biological n = 62; age range = 17.17 to 28.92 years, M = 19.46 years). One theoretical perspective, the social stigma hypothesis, argues that there is a stigma attached to stepfamilies, or that stepfamilies are consistently associated with negative stereotypes. In the current study, participants were assessed on a number of variables, including a semantic differential scale, a perceived conflict scale and a perceived general satisfaction scale. It was found that a consistently negative view of stepfamilies was prevalent. Furthermore, the negative stereotypes existed irrespective of participant family type. Results support the theoretical view that stepfamilies are stereotypically viewed as negative when compared to biological families.
Abstract:
This study aimed to determine whether optimism, social support, and work values predict subjective well-being, as well as to analyse the relations of demographic variables with these study variables, to describe them, and to examine the relations among them. The sample consisted of 47 men and 101 women with a mean age of 41.00 years (SD = 10.72) who were seeking support from an institution for their career transition. Data were collected with a self-administered questionnaire comprising five measures covering the study variables: optimism, perceived social support, work values, general satisfaction with life, and positive and negative affect, plus demographic variables: sex, age, education, work, volunteering, marital status, and length of stay at the institution. Descriptive statistics, tests of mean differences, correlations, analysis of variance, and multiple linear regression models were computed. The relations between the study variables and demographic variables showed that people who are not studying perceive more practical support and attach more importance to self-promotion and prestige motivations than those who are studying. Younger participants (up to 30 years) perceived more emotional and practical support than older ones. Perceptions of emotional and practical support decline with age, yet people over 50 reported less negative affect and greater life satisfaction than younger participants. Married participants attached less importance to job stability and financial security than separated, divorced, or widowed participants; single participants reported more negative affect than married ones. Men reported feeling more satisfied with life, with more positive affect and less negative affect, than women. Those who do volunteer work proved more optimistic and reported less negative affect than those who do not. The data showed that respondents have a good level of optimism and perceive more emotional than practical support; they are motivated chiefly by goals of achievement at work and of stability and financial security; they feel indifferent regarding life satisfaction; and they report positive affect slightly above indifference, yet little negative affect. Accordingly, slightly more than two thirds of the respondents showed a predominance of positive over negative emotional states. Optimism was the variable with the strongest and most numerous associations: it correlated positively with work achievement values, social-relations values, prestige work values, life satisfaction, and positive affect, and negatively with negative affect. Perceived emotional support correlated positively with prestige values, positive affect, and life satisfaction, and negatively with negative affect. Perceived practical support showed no significant correlation with any study variable. Positive affect correlated positively with social-relations work values and prestige work values. The analysis of three predictive models showed that optimism and emotional support have a positive effect on life satisfaction and on positive affect. Optimism has a negative effect on negative affect. Prestige work values have a positive effect on positive affect. Stability values have a negative effect on life satisfaction and on positive affect, and a positive effect on negative affect. The results of this study showed that an optimistic state is a powerful factor with positive impact on the state of health known as subjective well-being. (AU)
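The predictive models reported above are standard multiple linear regressions. A minimal sketch of that kind of model (hedged: the variable names and the synthetic data below are illustrative stand-ins, not the study's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 148  # matching the study's sample size (47 men, 101 women)

# Synthetic stand-ins for the study's measures; illustrative only.
optimism = rng.normal(3.5, 0.8, n)
emotional_support = rng.normal(3.0, 0.9, n)
life_satisfaction = 0.5 * optimism + 0.3 * emotional_support + rng.normal(0, 0.7, n)

# Multiple linear regression via least squares: satisfaction ~ optimism + support.
X = np.column_stack([np.ones(n), optimism, emotional_support])
beta, *_ = np.linalg.lstsq(X, life_satisfaction, rcond=None)
print("intercept, b_optimism, b_support =", np.round(beta, 3))
```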
Abstract:
The most liquid stocks in the IBOVESPA index reflect the behaviour of stocks in general, as well as the relation of macroeconomic variables to that behaviour, and are among the most traded in the Brazilian capital market. One may therefore expect that factors affecting the most liquid companies shape the behaviour of macroeconomic variables, and that the converse also holds: swings in macroeconomic factors such as IPCA, GDP, SELIC, and the exchange rate also affect the most liquid stocks. This study analyses the relation between macroeconomic variables and the behaviour of the most liquid stocks in the IBOVESPA index, corroborating studies that seek to understand the influence of macroeconomic factors on stock prices and contributing empirically to the formation of investment portfolios. The study covers the period from 2008 to 2014. The results indicate that portfolios built to protect invested capital should contain assets negatively correlated with the variables studied, which makes it possible to compose a portfolio with reduced risk.
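A minimal sketch of the screening step this conclusion suggests: compute the correlation of each asset's returns with a macro series and keep the negatively correlated ones (the tickers and series below are placeholders, not the study's data):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
periods = 84  # monthly observations, 2008-2014

# Placeholder monthly returns for a few liquid tickers and a macro series.
returns = pd.DataFrame(rng.normal(0, 0.05, (periods, 4)),
                       columns=["ASSET_A", "ASSET_B", "ASSET_C", "ASSET_D"])
selic_change = pd.Series(rng.normal(0, 0.3, periods), name="SELIC")

# Correlation of each asset with the macro variable; negatively correlated
# assets are the candidates for a capital-protecting portfolio.
corr = returns.corrwith(selic_change)
hedge_candidates = corr[corr < 0].index.tolist()
print(corr.round(3))
print("Negatively correlated (candidate hedges):", hedge_candidates)
```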
Abstract:
We introduce models of heterogeneous systems with finite connectivity defined on random graphs to capture finite-coordination effects on the low-temperature behaviour of finite-dimensional systems. Our models use a description in terms of small deviations of particle coordinates from a set of reference positions, particularly appropriate for the description of low-temperature phenomena. A Born-von Karman-type expansion with random coefficients is used to model effects of frozen heterogeneities. The key quantity appearing in the theoretical description is a full distribution of effective single-site potentials which needs to be determined self-consistently. If microscopic interactions are harmonic, the effective single-site potentials turn out to be harmonic as well, and the distribution of these single-site potentials is equivalent to a distribution of localization lengths used earlier in the description of chemical gels. For structural glasses characterized by frustration and anharmonicities in the microscopic interactions, the distribution of single-site potentials involves anharmonicities of all orders, and both single-well and double-well potentials are observed, the latter with a broad spectrum of barrier heights. The appearance of glassy phases at low temperatures is marked by the appearance of asymmetries in the distribution of single-site potentials, as previously observed for fully connected systems. Double-well potentials with a broad spectrum of barrier heights and asymmetries would give rise to the well-known universal glassy low-temperature anomalies when quantum effects are taken into account. © 2007 IOP Publishing Ltd.
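As an illustration of the kind of single-site potentials described, here is a minimal sketch of an asymmetric double-well and its barrier heights (the quartic-plus-tilt form and its parameters are generic assumptions, not the paper's self-consistently determined potentials):

```python
import numpy as np

# Generic asymmetric double-well: quartic plus a linear tilt (illustrative
# form only; the paper's potentials are determined self-consistently).
def V(x, a=1.0, b=0.25):
    return x**4 - a * x**2 + b * x

x = np.linspace(-1.5, 1.5, 40001)
v = V(x)
dv = np.gradient(v, x)

# Minima: derivative crosses zero from - to +; barrier top: from + to -.
minima = x[1:][(dv[:-1] < 0) & (dv[1:] >= 0)]
maxima = x[1:][(dv[:-1] > 0) & (dv[1:] <= 0)]

x_top = maxima[0]
barriers = [V(x_top) - V(m) for m in minima]   # barrier height seen from each well
asymmetry = abs(V(minima[0]) - V(minima[1]))   # energy offset between the wells
print("wells at", minima, "barriers", barriers, "asymmetry", asymmetry)
```

Varying the tilt b across sites would produce exactly the broad spectrum of barrier heights and asymmetries the abstract associates with glassy low-temperature anomalies.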
Abstract:
The introduction of agent technology raises several security issues that are beyond the capability and considerations of conventional security mechanisms, but research in protecting the agent from malicious host attack is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data into a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that is able to protect the integrity of the agent, using state information recorded during the agent execution process in a remote host environment to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration-plus-computation overheads of the itinerary and distributed migration patterns. The second set measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysing its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), is developed in order to demonstrate the function of the proposed approaches.
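A minimal sketch of the first idea, masking a critical value with several random polynomials evaluated at random inputs (the composition, the recovery step, and all names are illustrative assumptions; the thesis's exact algorithm and its noise-code scheme are not specified in the abstract):

```python
import secrets

# Illustrative obfuscation in the spirit described: several random polynomials,
# each evaluated at its own random input, mask the critical value.

def random_poly(degree=3, bound=10**6):
    # Random coefficients c_0..c_degree of p(x) = sum(c_i * x^i).
    return [secrets.randbelow(bound) for _ in range(degree + 1)]

def eval_poly(coeffs, x):
    return sum(c * x**i for i, c in enumerate(coeffs))

critical = 4242                              # the agent's critical datum
polys = [random_poly() for _ in range(3)]    # kept by the agent's owner
inputs = [1 + secrets.randbelow(10**6) for _ in polys]

mask = sum(eval_poly(p, x) for p, x in zip(polys, inputs))
obfuscated = critical + mask                 # what the agent actually carries

# A malicious host sees only `obfuscated`, which is meaningless without the
# polynomials and their inputs. The owner reverses the mapping:
recovered = obfuscated - mask
assert recovered == critical
```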
Abstract:
The premise of this thesis is that Western thought is characterised by the need to enforce binary classification in order to structure the world. Classifications of sexuality and gender both embody this tendency, which has been largely influenced by the Judeo-Christian tradition. Thus, it is argued that attitudes to sexuality, particularly homosexuality, are in part a function of the way in which we seek to impose structure on the world. From this view, it is (partly) the ambiguity inherent in gender and sexual variation that evokes negative responses. The thesis presents a series of inter-linked studies examining attitudes to various aspects of human sexuality, including the human body, non-procreative sex acts (anal and oral sex) and patterns of sexuality that depart from the hetero-homo dichotomy. The findings support the view that attitudes to sexuality are significantly informed by gender-role stereotypes, with negative attitudes linked to intolerance of ambiguity. Male participants show larger differences than female participants in their evaluations of male and female bodies, and of male and female sexual actors. Male participants also show greater negativity towards gay male sexual activity than do female participants, but males perceive lesbian sexuality similarly to heterosexuality. Male bodies are rated as less 'permeable' than female bodies, and male actors are more frequently identified as the instigators of sexual acts. Crucial to the concept of heterosexism is the assumption that 'femininity' is considered inherently inferior to 'masculinity'. Hence, the findings provide an empirical basis for making connections between heterosexism and sexism, and therefore between the psychology of women and gay and lesbian psychology.
Abstract:
This thesis reports on the principles and usefulness of Performance Rating as developed by the writer over a number of years. In Part One a brief analysis is made of the Quality scene and its development up to the present. The need is exposed for Performance Rating as a tool for all areas of management; this need, and the advantages of Performance Rating, are particularly demonstrated in Case Study No. 1. At the same time a system of Quality control is described which the writer has further developed under the title of 'Operator Control'. This system is based on the integration of all Quality control functions with the creative functions required for Quality achievement. The discussions are mainly focussed on the general philosophy of Quality, its creation and control, and that part of Operator Control which affects Performance Rating. Whereas it is shown that the combination of Operator Control and Performance Rating is both economically and technically advantageous, Performance Rating can also usefully be applied under inspection control conditions. Part Two describes the principles of Area Performance Rating, from which a summation expression is derived that gives the key for grouping areas with similar Performance Rating (P). A model is devised on which the theory is demonstrated. Relevant case studies, carried out in practice in factories, are quoted in Part Two, Chapter 4, one of them written by the Quality manager of the factory concerned. Particular stress is laid in the final conclusions on management's function in the Quality field and on how greatly this function is eased and improved through the introduction of Area Performance Rating.
Abstract:
This thesis addresses the kineto-elastodynamic analysis of a four-bar mechanism running at high speed, where all links are assumed to be flexible. First, the mechanism, at static configurations, is considered as a structure. Two methods are used to model the system, namely the finite element method (FEM) and the dynamic stiffness method. The natural frequencies and mode shapes at different positions from both methods are calculated and compared. The FEM is used to model the mechanism running at high speed. The governing equations of motion are derived using Hamilton's principle. The equations obtained are a set of stiff ordinary differential equations with periodic coefficients. A model is developed whereby the FEM and the dynamic stiffness method are used conjointly to provide high-precision results with only one element per link. The principal concern of the mechanism designer is the behaviour of the mechanism at steady state. Few algorithms have been developed to deliver the steady-state solution without resorting to costly time-marching simulation. In this study two algorithms are developed to overcome the limitations of the existing algorithms, and their superiority is demonstrated. The notion of critical speeds is clarified and a distinction is drawn between "critical speeds", where stresses are at a local maximum, and "unstable bands", where the mechanism deflections grow boundlessly. Floquet theory is used to assess the stability of the system, and a simple method to locate the critical speeds is derived. It is shown that the critical speeds of the mechanism coincide with the local maxima of the eigenvalues of the transition matrix with respect to the rotational speed of the mechanism.
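Floquet analysis of a linear system with periodic coefficients can be sketched as follows: integrate the system over one period from each unit initial condition to build the transition (monodromy) matrix, then inspect the magnitudes of its eigenvalues. The sketch below uses a Mathieu-type oscillator as a stand-in for the mechanism's equations, which the abstract does not give:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Floquet sketch for x'' + (delta + eps*cos(t)) x = 0, a Mathieu-type
# stand-in for a linear ODE system with periodic coefficients.
def rhs(t, y, delta, eps):
    x, v = y
    return [v, -(delta + eps * np.cos(t)) * x]

def monodromy(delta, eps, T=2 * np.pi):
    # Columns of the transition matrix: solutions over one period that start
    # from the unit basis vectors.
    cols = []
    for y0 in np.eye(2):
        sol = solve_ivp(rhs, (0.0, T), y0, args=(delta, eps),
                        rtol=1e-10, atol=1e-12)
        cols.append(sol.y[:, -1])
    return np.column_stack(cols)

M = monodromy(delta=0.25, eps=0.1)  # near the principal parametric resonance
mults = np.linalg.eigvals(M)
print("Floquet multipliers:", mults)
print("unstable" if np.max(np.abs(mults)) > 1.0 else "stable")
```

Sweeping the rotational speed (here, the parameters delta and eps) and tracking the eigenvalue magnitudes is the kind of procedure by which unstable bands and local maxima are located.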
Abstract:
Adaptability in distributed object-oriented enterprise frameworks is critical for system evolution. Today, building adaptive services is a complex task due to the lack of adequate framework support in the distributed computing environment. In this thesis, we propose a Meta Level Component-Based Framework (MELC) which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our novel approach of combining a meta architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. The critical nature of distributed technologies requires frameworks to be adaptable. Our framework employs a meta architecture: it supports dynamic adaptation of feasible design decisions in the framework design space by specifying and coordinating meta-objects that represent various aspects of the distributed environment. The meta architecture in MELC provides the adaptability needed for system evolution, resolving the problem of dynamic adaptation in the framework that is encountered in most distributed applications. The concept of using a meta architecture to produce an adaptable pattern-oriented framework for distributed computing applications is new and has not previously been explored in research. Because the framework is adaptable, the proposed architecture can dynamically adopt new design patterns to address technical system issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in the future. We show how MELC can be used effectively to enable dynamic component integration and to separate system functionality from business functionality. We demonstrate how MELC provides an adaptable and dynamic run-time environment using our system configuration and management utility. We also highlight how MELC provides significant adaptability for system evolution through a prototype E-Bookshop application that assembles its business functions with distributed computing components at the meta level of the MELC architecture. Our performance tests show that MELC does not entail prohibitive performance tradeoffs. The work to develop the MELC framework for distributed computing applications has emerged as a promising way to meet current and future challenges in the distributed environment.
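A minimal sketch of the meta-architecture idea: meta-objects represent aspects of the distributed environment and can be re-bound at run time without restarting the application (all class and aspect names here are illustrative; the abstract does not give MELC's API):

```python
from typing import Protocol

# Meta-level registry sketch: aspects of the distributed environment are
# represented by components that can be swapped at run time.
class Transport(Protocol):
    def send(self, msg: str) -> None: ...

class TcpTransport:
    def send(self, msg: str) -> None:
        print(f"TCP -> {msg}")

class QueueTransport:
    def send(self, msg: str) -> None:
        print(f"queue -> {msg}")

class MetaRegistry:
    """Maps aspect names to pattern components; rebinding adapts the system."""
    def __init__(self) -> None:
        self._aspects: dict = {}

    def bind(self, aspect: str, component) -> None:
        self._aspects[aspect] = component

    def get(self, aspect: str):
        return self._aspects[aspect]

registry = MetaRegistry()
registry.bind("transport", TcpTransport())
registry.get("transport").send("order created")   # base behaviour
registry.bind("transport", QueueTransport())      # dynamic adaptation
registry.get("transport").send("order created")   # new behaviour, no restart
```

Business code depends only on the aspect name, which is one way the separation of system functionality from business functionality described above can be realised.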
Abstract:
Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of the banking sector in GCC countries. Since the selected variables take negative values for some banks and positive values for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM result alone provides limited information for decision makers (bankers, investors, etc.), we propose a second-stage analysis using the classification and regression (C&R) method, combining SORM results with other environmental data (financial, economic, and political) to derive rules characterising the efficient banks; the results are thus useful to bankers seeking to improve their banks' performance and to investors seeking to maximise their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), each comprising different methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming. The nonparametric approach comprises two methods, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH), while the parametric approach comprises three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA), and Distribution-Free Analysis (DFA). The literature shows that DEA and SFA are the most applicable methods in the banking sector, with DEA the most popular among researchers. However, DEA, like SFA, still faces many challenges; one of them is how to deal with negative data, since DEA requires all input and output values to be non-negative, while in many applications negative outputs can appear, e.g. losses as opposed to profits. Although a few DEA models have been developed to deal with negative data, we believe each has its own limitations, and we therefore developed SORM to handle the negativity issue in DEA. The application of SORM shows that the overall efficiency of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) due to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterparts in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani, and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; these two countries were the most affected by the second Gulf War. The results also show no statistical relationship between operating style (Islamic or Conventional) and bank efficiency. Even though the difference is not statistically significant, Islamic banks appear more efficient than Conventional banks, with an average efficiency score of 86.33% compared with 85.38%. Furthermore, Islamic banks appear more affected by the political crisis (the second Gulf War), whereas Conventional banks appear more affected by the financial crisis.
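For orientation, here is a minimal sketch of the baseline input-oriented DEA (CCR) model that SORM extends. SORM's distinctive step, splitting each variable that takes negative values into positive and negative parts, is omitted, and the bank data below are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA: for each DMU o, minimise theta subject to
#   sum_j lam_j * x_ij <= theta * x_io   (inputs)
#   sum_j lam_j * y_rj >= y_ro           (outputs), lam >= 0.
# Note DEA assumes non-negative data; SORM relaxes this, which is not shown here.

X = np.array([[2.0, 3.0, 6.0, 4.0],    # input 1 per bank (illustrative)
              [3.0, 2.0, 7.0, 5.0]])   # input 2 per bank
Y = np.array([[4.0, 5.0, 8.0, 4.0]])   # output per bank

n = X.shape[1]
for o in range(n):
    # Decision vector: [theta, lam_1 .. lam_n]; minimise theta.
    c = np.zeros(1 + n); c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(X.shape[0]):                                  # inputs
        A_ub.append(np.concatenate(([-X[i, o]], X[i]))); b_ub.append(0.0)
    for r in range(Y.shape[0]):                                  # outputs
        A_ub.append(np.concatenate(([0.0], -Y[r]))); b_ub.append(-Y[r, o])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"bank {o}: efficiency = {res.x[0]:.3f}")
```

An efficiency of 1 marks a bank on the frontier; scores below 1 indicate the proportional input reduction a comparable combination of peer banks could achieve.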
Abstract:
Sentiment analysis over Twitter offers organisations a fast and effective way to monitor the public's feelings towards their brand, business, directors, etc. A wide range of features and methods for training sentiment classifiers for Twitter datasets have been researched in recent years with varying results. In this paper, we introduce a novel approach of adding semantics as additional features into the training set for sentiment analysis. For each entity extracted from tweets (e.g. iPhone), we add its semantic concept (e.g. Apple product) as an additional feature, and measure the correlation of the representative concept with negative/positive sentiment. We apply this approach to predict sentiment for three different Twitter datasets. Our results show an average increase in the F (harmonic mean) score for identifying negative and positive sentiment of around 6.5% and 4.8% over the baselines of unigram and part-of-speech features respectively. We also compare against an approach based on sentiment-bearing topic analysis, and find that semantic features produce better Recall and F score when classifying negative sentiment, and better Precision with lower Recall and F score in positive sentiment classification.
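A minimal sketch of the semantic-feature idea: map extracted entities to broader concepts and append each concept as an extra token before vectorising (the entity-to-concept table and the toy data are stand-ins for the paper's semantic extraction pipeline):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in for an entity -> semantic concept mapping.
CONCEPTS = {"iphone": "apple_product", "ipad": "apple_product",
            "galaxy": "samsung_product"}

def add_semantic_features(tweet: str) -> str:
    # Append the concept of each recognised entity as an extra token.
    tokens = tweet.lower().split()
    concepts = [CONCEPTS[t] for t in tokens if t in CONCEPTS]
    return " ".join(tokens + concepts)

tweets = ["love my new iphone", "my galaxy keeps crashing",
          "ipad screen is gorgeous", "iphone battery is terrible"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative (toy data)

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit([add_semantic_features(t) for t in tweets], labels)
print(model.predict([add_semantic_features("thinking about an iphone")]))
```

The concept token lets sentiment observed for one entity (iPhone) generalise to unseen entities sharing the same concept (iPad), which is the intuition behind the reported Recall gains.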
Abstract:
We report the results of numerical studies of the impact of asymmetric femtosecond pulses focused in the bulk of the material on the femtosecond modification of fused silica. It is shown that such pulses lead to localisation of absorption in the process of femtosecond modification and to a decrease in the threshold energy of modification. It is found that the optimal asymmetry parameters for reaching the maximum plasma density in the focusing region depend on the pulse energy: at an initial energy of about 100 nJ, it is preferable to use pulses with positive third-order dispersion (TOD); however, when the energy is increased, it is preferable to use pulses with negative TOD. This is explained by differences in the dynamics of the processes of absorption of energy of a pulse propagating in the material.
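The temporal asymmetry induced by TOD can be sketched directly: apply a cubic spectral phase to a Gaussian spectrum and inverse-transform; flipping the sign of the TOD flips the asymmetry. This is a generic pulse-shaping illustration, not the paper's propagation model, and the pulse duration and TOD magnitude are assumed values:

```python
import numpy as np

# Cubic spectral phase (TOD) on a Gaussian pulse; the sign of the TOD sets
# which edge of the pulse carries the oscillatory tail.
n, dt = 4096, 1e-15                      # grid: 4096 points, 1 fs step
t = (np.arange(n) - n // 2) * dt
w = 2 * np.pi * np.fft.fftfreq(n, dt)    # angular frequency grid

tau = 30e-15                             # 30 fs Gaussian field envelope
field = np.exp(-t**2 / (2 * tau**2))
spectrum = np.fft.fft(field)

for tod in (+1e-40, -1e-40):             # s^3; assumed illustrative magnitude
    shaped = np.fft.ifft(spectrum * np.exp(1j * tod / 6.0 * w**3))
    intensity = np.abs(shaped)**2
    peak = t[np.argmax(intensity)]
    print(f"TOD {tod:+.0e} s^3: peak shifted to {peak * 1e15:+.1f} fs")
```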
Abstract:
The IEEE 802.15.4 standard has been proposed for low-power wireless personal area networks. It can be used as an important component in machine-to-machine (M2M) networks for data collection, monitoring and controlling functions. With an increasing number of machine devices enabled by M2M technology and equipped with 802.15.4 radios, it is likely that multiple 802.15.4 networks will be deployed close together, for example, to collect data for smart metering in residential or enterprise areas. In such scenarios, supporting reliable communications for monitoring and controlling applications is a big challenge. The problem becomes more severe due to potential hidden terminals when the operations of multiple 802.15.4 networks are uncoordinated. In this paper, we investigate this problem in three typical scenarios and propose an analytic model to reveal how the performance of coexisting 802.15.4 networks may be affected by uncoordinated operations under these scenarios. Simulations are used to validate the analytic model. It is observed that uncoordinated operations may lead to a significant degradation of system performance in M2M applications. With the proposed analytic model, we also investigate the performance limits of 802.15.4 networks, and the conditions under which coordinated operations may be required to support M2M applications. © 2012 Springer Science + Business Media, LLC.
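A toy Monte Carlo sketch of why uncoordinated operation hurts: two duty-cycled networks with independent random superframe offsets risk collisions whenever their active periods overlap. The durations and the overlap criterion below are simplifying assumptions, not the paper's analytic model:

```python
import random

# Toy model: each network is active for `active` s out of every `interval` s,
# with an independent random offset. Overlapping active periods expose
# transmissions to hidden-terminal collisions.
def overlap_probability(active=0.1, interval=1.0, trials=100_000):
    hits = 0
    for _ in range(trials):
        a = random.uniform(0, interval)   # offset of network A's active period
        b = random.uniform(0, interval)   # offset of network B's active period
        gap = min(abs(a - b), interval - abs(a - b))  # circular distance
        if gap < active:                  # active periods overlap
            hits += 1
    return hits / trials

for duty in (0.05, 0.1, 0.25):
    print(f"duty cycle {duty:.0%}: overlap probability ≈ {overlap_probability(duty):.3f}")
```

Even at low duty cycles the overlap probability is non-trivial (about twice the duty cycle in this toy), which is why coordinated scheduling becomes necessary as traffic grows.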
Abstract:
An iterative procedure is proposed for the reconstruction of a temperature field from a linear stationary heat equation with stochastic coefficients, and stochastic Cauchy data given on a part of the boundary of a bounded domain. In each step, a series of mixed well-posed boundary-value problems are solved for the stochastic heat operator and its adjoint. Well-posedness of these problems is shown to hold and convergence in the mean of the procedure is proved. A discretized version of this procedure, based on a Monte Carlo Galerkin finite-element method, suitable for numerical implementation is discussed. It is demonstrated that the solution to the discretized problem converges to the continuous as the mesh size tends to zero.
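A minimal sketch of such an alternating iteration, reduced to a single Fourier mode so that each mixed well-posed problem becomes a two-point boundary-value problem solved by finite differences. The deterministic one-mode reduction, the grid, and all parameters are illustrative simplifications of the paper's stochastic Galerkin setting:

```python
import numpy as np

# One Fourier mode of the stationary heat (Laplace) equation in a strip:
# u'' = k^2 u on (0,1), Cauchy data u(0)=f, u'(0)=g given at y=0, data at y=1
# unknown. Each step solves a well-posed mixed (Dirichlet/Neumann) problem and
# the Neumann guess at y=1 is updated until the iteration settles.

def solve_mixed(k, n, left, right):
    """Solve u'' = k^2 u with ('D'|'N', value) conditions at y=0 and y=1."""
    h = 1.0 / (n - 1)
    A = np.zeros((n, n)); b = np.zeros(n)
    for i in range(1, n - 1):  # interior 3-point stencil
        A[i, i - 1] = 1.0; A[i, i] = -2.0 - (k * h) ** 2; A[i, i + 1] = 1.0
    for idx, (kind, val), sgn in ((0, left, 1), (n - 1, right, -1)):
        if kind == "D":
            A[idx, idx] = 1.0; b[idx] = val
        else:  # second-order one-sided approximation of the derivative
            A[idx, idx] = -3.0 * sgn; A[idx, idx + sgn] = 4.0 * sgn
            A[idx, idx + 2 * sgn] = -1.0 * sgn; b[idx] = 2 * h * val
    return np.linalg.solve(A, b)

k, n = 1.0, 201
h = 1.0 / (n - 1)
f, g = 1.0, 0.5        # Cauchy data at y = 0
eta = 0.0              # initial guess for u'(1)
for _ in range(60):
    ua = solve_mixed(k, n, ("D", f), ("N", eta))   # Dirichlet at 0, Neumann at 1
    phi = ua[-1]                                   # reconstructed trace at y = 1
    ub = solve_mixed(k, n, ("N", g), ("D", phi))   # Neumann at 0, Dirichlet at 1
    eta = (3 * ub[-1] - 4 * ub[-2] + ub[-3]) / (2 * h)  # updated u'(1)

exact = f * np.cosh(k) + (g / k) * np.sinh(k)      # true u(1) for this mode
print(f"recovered u(1) = {phi:.6f}, exact = {exact:.6f}")
```

For this mode the iteration contracts geometrically (rate tanh²k), and refining the grid drives the recovered trace to the exact one, mirroring the convergence statement in the abstract.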
Abstract:
We examined the effect of priming participants' own network expectations on their subsequent identification with their friendship group. We examined this prime alongside attachment anxiety and attachment threat as predictors of friendship group identification. Previous research has suggested that attachment anxiety is associated with negative network expectations. In this study, we extended this work to show that when a network expectation prime was absent, higher attachment anxiety was associated with lower group identification under attachment threat, compared to a control condition. However, when expectations of the support network were primed, attachment threat no longer affected group identification, so that only attachment anxiety predicted group identification. This suggests that priming participants who are high in attachment anxiety with their own (negative) network expectancies leads them to dis-identify with their friendship group, regardless of whether or not they have experienced attachment threat. © 2012 Elsevier Ltd.