928 results for Job Search Methods
Abstract:
We present a neoclassical model of capital accumulation with frictional labour markets. Under standard parameter values the equilibrium of the model is indeterminate and consequently displays expectations-driven business cycles – so-called endogenous business cycles. We study the properties of such cycles, and find that the model predicts the high autocorrelation in output growth and the hump-shaped impulse response of output found in US data – important features that existing endogenous real business cycle models fail to explain. The indeterminacy of the equilibrium stems from job search externalities and does not rely on increasing returns to scale as in most models.
Abstract:
This chapter focuses on the careers of temporary workers. For many workers, temporary employment presents a route to permanent employment. Other workers, however, become trapped in temporary employment or cycle between unstable jobs and spells of unemployment. There are multiple predictors of such transitions. We selected two broad categories, namely perceived employability from the area of career research, and health and well-being from the area of occupational health and well-being research. The overall conclusion is that the association between temporary employment and both perceived employability and health and well-being is inconclusive. This suggests that there are boundary conditions that may make some temporary workers successful and others not. Risk factors include dynamics related to the dual labor market, including lower job quality, lower investments on the part of employers, and negative stereotyping of temporary workers as second-class citizens. On the positive side, many temporary workers have learned to manage their careers in the sense that they invest in training and in continuous job search.
Abstract:
Cytochrome P450 (CYP450) is a class of enzymes for which substrate identification is particularly important: it would help medicinal chemists design drugs with fewer side effects due to drug-drug interactions and to extensive genetic polymorphism. Herein, we discuss the application of 2D and 3D similarity searches in identifying reference structures with a higher capacity to retrieve substrates of three important CYP enzymes (CYP2C9, CYP2D6, and CYP3A4). On the basis of the complementarity of multiple reference structures selected by different similarity search methods, we propose the fusion of their individual Tanimoto scores into a consensus Tanimoto score (T(consensus)). Using this new score, true positive rates of 63% (CYP2C9) and 81% (CYP2D6) were achieved with false positive rates of 4% for the CYP2C9-CYP2D6 data set. Extended similarity searches were carried out on a validation data set, and the results showed that by using the T(consensus) score, not only did the area under the ROC curve increase, but also more substrates were recovered at the beginning of a ranked list.
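As an illustration of the consensus idea described above, here is a minimal Python sketch, assuming binary fingerprints represented as sets of on-bit indices; the fusion rule (a mean over references) and all fingerprints are assumptions, not the paper's exact T(consensus) definition.

```python
# Hedged sketch of consensus Tanimoto scoring over multiple references.
# Fingerprints are toy sets of on-bit indices; the mean fusion rule is
# an assumption, not necessarily the paper's T(consensus) formula.

def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto coefficient between two binary fingerprints."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def consensus_tanimoto(candidate: set, references: list[set]) -> float:
    """Fuse the per-reference Tanimoto scores into one consensus score."""
    scores = [tanimoto(candidate, ref) for ref in references]
    return sum(scores) / len(scores)

# Rank a hypothetical library against reference substrates so that
# likely substrates appear at the top of the list.
references = [{1, 4, 7, 9}, {1, 4, 8}]            # toy reference fingerprints
library = {"mol_a": {1, 4, 7}, "mol_b": {2, 3}}   # toy candidate fingerprints
ranked = sorted(library, key=lambda m: consensus_tanimoto(library[m], references), reverse=True)
print(ranked)  # mol_a ranks above mol_b
```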
Abstract:
To make oneself employable – instrumental identity positions and gendered marketability. The article takes up the question of how expert discourses in career advice are negotiated and how the self is constructed in the practice of CV writing. Drawing on data from semi-structured interviews with 19 students (11 women and 8 men), the students' stories about job search are analyzed. The concept of discursive positioning is used to analyze how the students position themselves in relation to career advice and position the self in CV writing. The results show that the female students had difficulty embodying the position of a marketing self, as they described it as conflicting with feelings of 'who they were'. 'Being you' in CVs and job interviews is, further, an ideal that is negotiated in relation to what to display as a job-seeking subject. CV writing involves identifying suitable characteristics in an instrumental manner, but it is also combined with introspective reasoning and identification to find 'authentic' strengths and characteristics in 'who you are'.
Abstract:
Hurting to help or helping to hurt? The reservation wages of the unemployed, job chances, and reemployment incomes in Sweden. Economic incentives and their impact on the job search behaviour of the unemployed have been a central focus of the academic and political debate in Sweden. A key concept has been the reservation wage of the unemployed: the lowest wage at which an unemployed person would be willing to accept a job offer. Unemployment benefit systems have been argued to raise and maintain reservation wages at high levels that lower job chances, a claim supported by a large number of international studies. From this perspective, lower reservation wages would function as protection against long-term unemployment and the scarring effects associated with it. High reservation wages might, however, based on the same behavioural assumptions, have a human-capital-preserving effect: the possibility of holding out for the right job should reduce human capital losses compared with accepting the first available job offer. In this article we use Swedish longitudinal micro data, combining interview and register data, to investigate three central aspects of reservation wages in the Swedish context: the factors influencing the setting of reservation wages, the effect of reservation wages on job chances, and the impact of reservation wages on reemployment incomes. Our findings show that the benefit level and the pre-unemployment position in the wage structure are central factors in setting the reservation wage. The effects of reservation wages were, however, not as expected: no effect was found on job chances, while a strong positive effect was found on reemployment income. Together, this indicates that high reservation wages have a human-capital-preserving effect in Sweden.
Abstract:
Several empirical studies in the literature have documented the existence of a positive correlation between income inequality and unemployment. I provide a theoretical framework under which this correlation can be better understood. The analysis is based on a dynamic model of job search under uncertainty. I start by proving the uniqueness of a stationary distribution of wages in the economy. Drawing upon this distribution, I provide a general expression for the Gini coefficient of income inequality. The expression has the advantage of not requiring a particular specification of the distribution of wage offers. Next, I show how the Gini coefficient varies as a function of the parameters of the model, and how it can be expected to be positively correlated with the rate of unemployment. Two examples are offered: the first, of a technical nature, shows that the convergence of the measures implied by the underlying Markov process can fail in some cases; the second provides a quantitative assessment of the model and of the mechanism linking unemployment and inequality.
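For concreteness, here is a minimal sketch of the generic empirical Gini computation that such a framework quantifies; the paper's closed-form, offer-distribution-free expression is not reproduced, and the wage samples below are purely illustrative.

```python
# Hedged sketch: the empirical Gini coefficient of a wage sample,
# computed via the standard rank-based formula on the sorted sample.
# The paper derives a closed-form Gini from its stationary wage
# distribution; this only illustrates the quantity being measured.
def gini(wages: list[float]) -> float:
    """Gini coefficient: mean absolute difference / (2 * mean)."""
    w = sorted(wages)
    n, total = len(w), sum(w)
    weighted = sum((i + 1) * x for i, x in enumerate(w))  # rank-weighted sum
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

print(gini([10, 10, 10, 10]))  # 0.0: perfect equality
print(gini([0, 0, 0, 100]))    # 0.75: near-maximal inequality for n = 4
```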
Abstract:
By mixing together inequalities based on cyclical variables, such as unemployment, and on structural variables, such as education, the usual measurements of income inequality add up objects of a different economic nature. Since jobs are not acquired or lost as fast as education or skills, this aggregation leads to a loss of relevant economic information. Here I propose a different procedure for the calculation of inequality. The procedure uses economic theory to construct an inequality measure of a long-run character, the calculation of which can nevertheless be performed with just one set of cross-sectional observations. Technically, the procedure is based on the uniqueness of the invariant distribution of wage offers in a job-search model. Workers should be pre-grouped by the distribution of wage offers they face, and only between-group inequalities should be considered. This construction incorporates the fact that the average wages of all workers in the same group tend to be equalized by the continuous turnover in the job market.
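A hedged sketch of the pre-grouping idea follows, assuming the long-run measure is computed as a Gini over group mean wages; the grouping, group names, and numbers are illustrative only, not the paper's exact construction.

```python
# Sketch of a between-group inequality measure: every worker's wage is
# replaced by the mean wage of their group (workers facing the same
# wage-offer distribution), so within-group dispersion -- attributed to
# job-market turnover -- is ignored. Groups and numbers are illustrative.

def between_group_gini(groups: dict[str, list[float]]) -> float:
    smoothed = []
    for wages in groups.values():
        mean = sum(wages) / len(wages)
        smoothed.extend([mean] * len(wages))   # replace wages by group mean
    w = sorted(smoothed)
    n, total = len(w), sum(w)
    weighted = sum((i + 1) * x for i, x in enumerate(w))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

groups = {"low_skill": [8, 10, 12], "high_skill": [20, 25, 30]}
print(between_group_gini(groups))  # inequality between groups only
```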
Abstract:
I use search models to study decentralized markets for durable goods. I explore the concept of market liquidity of Lippman and McCall (1986) and show that the theory of optimal search is useful for addressing the following questions: What governs the time required to make a transaction in these markets? What is the relationship between the price of goods and the time required to make transactions? Why is it optimal to wait to make a transaction in markets where individuals discount future utility? What is the socially optimal search level? Two specifications are used: the traditional model of job search and a version of the Krainer and LeRoy (2001) model.
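A minimal sketch of the optimal-search logic behind these questions, assuming a seller who draws one price offer per period from a known finite distribution and discounts at factor beta; under these assumptions the reservation price R solves the fixed point R = beta * E[max(p, R)]. All numbers are illustrative.

```python
# Hedged sketch of a reservation-price rule in the spirit of optimal
# search (Lippman and McCall): accept the first offer at or above R,
# where R = beta * E[max(p, R)]. The fixed-point map is a contraction
# for beta < 1, so simple iteration converges.
def reservation_price(offers: list[float], beta: float, tol: float = 1e-10) -> float:
    r = 0.0
    while True:
        r_new = beta * sum(max(p, r) for p in offers) / len(offers)
        if abs(r_new - r) < tol:
            return r_new
        r = r_new

offers = [80.0, 90.0, 100.0, 110.0, 120.0]  # equally likely price offers
print(reservation_price(offers, beta=0.95))
# A more patient seller (beta closer to 1) sets a higher R and waits
# longer to transact -- the price/liquidity trade-off in the abstract.
```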
Abstract:
This paper presents a study carried out with credit-card customers of a large retailer to measure the risk of relationship abandonment when a purchase history already exists. Two activities are central to this study: the theoretical review and the methodological procedures. The first step was understanding the problem, the importance of the theme, and the definition of the search methods. The study includes a bibliographic survey covering several authors and shows that customer loyalty is the basis of sustainability and profitability for organizations in various market segments; it examines satisfaction as the key to winning and, especially, retaining customers. Logistic-linear models were fitted, and the best model was selected using the Kolmogorov-Smirnov (KS) test and the Receiver Operating Characteristic (ROC) curve. Registration and transactional data on 100,000 customers of a credit card issuer were used; the software was SPSS, a modern system for data manipulation, statistical analysis, and graphical presentation. Through a score, the research identifies the risk that each customer will abandon the product.
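A minimal sketch of the model-selection step described above, using scikit-learn in place of SPSS (the tool named in the abstract) and synthetic data in place of the registration and transactional variables.

```python
# Sketch of churn-risk scoring with a logistic model, selected via the
# ROC curve and the KS statistic (max separation between the TPR and
# FPR curves). Data is synthetic; the study used real customer data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                       # synthetic predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
score = model.predict_proba(X_te)[:, 1]              # abandonment-risk score

auc = roc_auc_score(y_te, score)                     # area under the ROC curve
fpr, tpr, _ = roc_curve(y_te, score)
ks = float(np.max(tpr - fpr))                        # Kolmogorov-Smirnov statistic
print(f"AUC = {auc:.3f}, KS = {ks:.3f}")             # higher is better for both
```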
Abstract:
Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conductive elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas, and microwave ovens, for example. Their main purpose is to filter frequency bands, which can be transmitted or rejected depending on the requirements of the application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi, and WiMAX, whose services are in high demand by society, have required the development of antennas whose main features are low profile, low cost, and reduced dimensions and weight. In this context, the microstrip antenna presents itself as an excellent choice for today's communication systems because, in addition to intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other components in microwave circuits. Consequently, the analysis and synthesis of these devices, mainly due to the wide range of possible shapes, sizes, and operating frequencies of their elements, have been carried out with full-wave models such as the finite element method, the method of moments, and the finite-difference time-domain method. These methods are accurate but demand great computational effort. In this context, computational intelligence (CI) has been used successfully, as a very appropriate auxiliary tool, in the design and optimization of planar microwave structures, given the complexity of the geometry of the antennas and FSS considered. Computational intelligence is inspired by natural phenomena such as learning, perception, and decision-making, using techniques such as artificial neural networks, fuzzy logic, fractal geometry, and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods, based on genetics and on the theory of natural selection proposed by Darwin, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization uses collective intelligence and has been applied to optimization problems in many areas of research. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. We considered a ring-type planar monopole microstrip antenna and a cross-dipole FSS. We developed optimization algorithms and obtained optimized geometries for the antennas and FSS considered. To validate the results, several prototypes were designed, built, and measured. The measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared to those simulated with commercial software, and excellent agreement was also observed.
Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results, both in optimizing the bandwidth of an antenna for wideband or UWB (Ultra Wideband) operation using a genetic algorithm, and in optimizing the bandwidth by adjusting the length of the air gap between two frequency selective surfaces using a particle swarm optimization algorithm.
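A generic genetic-algorithm sketch of the kind of search this work describes, with a stand-in fitness function; in the actual work, fitness would come from a full-wave simulation (or a surrogate model) of the antenna/FSS geometry, and everything below is illustrative.

```python
# Hedged sketch of a genetic algorithm over a real-valued geometry
# vector. The fitness below is a hypothetical stand-in (optimum at all
# genes = 0.5); a real design loop would evaluate an EM simulation.
import random

def fitness(genes: list[float]) -> float:
    return -sum((g - 0.5) ** 2 for g in genes)       # stand-in objective

def genetic_search(n_genes=4, pop_size=30, generations=60, p_mut=0.1):
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_genes)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:              # random-reset mutation
                child[random.randrange(n_genes)] = random.random()
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

print(genetic_search())  # converges near [0.5, 0.5, 0.5, 0.5]
```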
Abstract:
Chronic venous disease (CVD) is prominent among chronic diseases; it primarily affects the elderly population and is the main cause of leg ulcers in this population. The use of dressings in the care of a venous ulcer is a fundamental part of the treatment for healing; however, evidence to assist in choosing the best dressing is scarce. The main objective of this study was to evaluate the effectiveness of treatment with hydrogel in the healing of venous ulcers, using search methods, synthesis of information, and statistical analysis through a systematic review and meta-analysis. Randomized controlled trials were selected from the following databases: CENTRAL, DARE, NHS EED, MEDLINE, EMBASE, and CINAHL. In addition to these databases, three registries were consulted to identify ongoing studies: ClinicalTrials.gov, WHO ICTRP, and ISRCTN. The primary outcomes analyzed were complete wound healing and incidence of wound infection; the secondary outcomes were changes in ulcer size, time to ulcer healing, recurrence of the ulcer, participants' quality of life, pain, and costs of treatment. Four studies are currently included in the review, with a total of 250 participants. The use of hydrogel appears to be superior to a conventional dressing of gauze soaked in saline for the healing of venous leg ulcers; 16/30 patients showed complete healing of ulcers (RR 5.33, 95% CI [1.73, 16.42]). Alginate gel was shown to be more effective than the hydrogel dressing in reducing the wound area: 61.2% (±26.2%) with alginate versus 19.4% (±24.3%) with hydrogel at the end of four weeks of treatment. Manuka honey was shown to be similar to hydrogel dressings in percentage of area reduction. This review demonstrated that there is no conclusive evidence available on the effectiveness of hydrogel compared with other types of dressings for the healing of venous ulcers of the lower limbs, demonstrating the need for future studies to assist health professionals in choosing the correct dressing.
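A sketch of the relative-risk arithmetic behind a figure like "RR 5.33, 95% CI [1.73, 16.42]": the hydrogel arm (16/30 healed) is from the abstract, while the control arm used below (3/30) is an assumption, chosen because it reproduces the reported interval.

```python
# Hedged sketch of a relative-risk calculation with a 95% CI via the
# standard log-RR standard error. The control-arm counts (3/30) are an
# assumption; only the hydrogel arm (16/30) appears in the abstract.
from math import exp, log, sqrt

def relative_risk(a, n1, c, n2, z=1.96):
    """RR and 95% CI for events a/n1 (treatment) vs c/n2 (control)."""
    rr = (a / n1) / (c / n2)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n2)      # SE of log(RR)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(16, 30, 3, 30)
print(f"RR {rr:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")  # RR 5.33, 95% CI [1.73, 16.42]
```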
Abstract:
This paper proposes a new approach and coding scheme for solving economic dispatch (ED) problems in power systems through an effortless hybrid method (EHM). The novel coding scheme can effectively prevent futile searching and also prevents infeasible solutions from arising in the application of stochastic search methods, consequently improving search efficiency and solution quality dramatically. The dominant constraint of an economic dispatch problem is power balance. Operational constraints such as generation limits, ramp rate limits, prohibited operating zones (POZ), and network losses are considered for practical operation. First, in the EHM procedure, the output of each generator is obtained with a lambda iteration method without considering POZ; this constraint is then satisfied in a genetic-based algorithm. To demonstrate its efficiency, feasibility, and speed, the EHM algorithm was applied to solve constrained ED problems for power systems with 6 and 15 units. The simulation results obtained with the EHM were compared to those from the previous literature in terms of solution quality and computational efficiency. The results reveal the superiority of this method in terms of both cost and CPU time.
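A minimal sketch of the first EHM stage as described: lambda iteration for economic dispatch with quadratic costs and generation limits, ignoring the POZ and ramp constraints handled by the paper's genetic stage. The unit data is illustrative.

```python
# Hedged sketch of lambda iteration for economic dispatch with costs
# C_i(P) = a_i + b_i*P + c_i*P^2: at marginal cost lambda, each unit
# produces (lambda - b_i) / (2*c_i), clipped to its limits; bisection
# on lambda enforces the power-balance constraint sum(P_i) = demand.
def lambda_iteration(units, demand, tol=1e-6):
    """units: list of (b, c, p_min, p_max); returns per-unit outputs."""
    def outputs(lam):
        return [min(max((lam - b) / (2 * c), lo), hi) for b, c, lo, hi in units]

    lam_lo, lam_hi = 0.0, 1000.0
    while lam_hi - lam_lo > tol:             # bisection on lambda
        lam = 0.5 * (lam_lo + lam_hi)
        if sum(outputs(lam)) < demand:
            lam_lo = lam
        else:
            lam_hi = lam
    return outputs(0.5 * (lam_lo + lam_hi))

units = [(7.0, 0.008, 10, 85), (6.3, 0.009, 10, 80), (6.8, 0.007, 10, 70)]
dispatch = lambda_iteration(units, demand=150)
print([round(p, 2) for p in dispatch], "sum =", round(sum(dispatch), 2))
```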
Abstract:
Graduate Program in General Bases of Surgery (Bases Gerais da Cirurgia) - FMB
Abstract:
This work is a study of the chemistry teacher education program at UFPA, recounting its history through the curricular designs that guided it over its 30 years of existence and placing that history within the larger context of the construction of modern science. The study goes on to analyze, in light of the literature, the political-pedagogical project recently approved by the Program Board, seeking to detect possible advances, and concludes by answering, from the picture outlined by the program's history and by the present moment, the question: given the demands imposed by modern society, how should a chemistry teacher be educated today? The main research methods used were the collection of testimonies through semi-structured interviews and documentary and bibliographic research, with greater emphasis on the latter.
Abstract:
This work presents a method for finding a set of operating points, which are Pareto-optimal with diversity, for digital subscriber lines (DSL). Several works in the literature have proposed algorithms for optimizing data transmission on DSL lines that yield only a single operating point for the modems. These works generally use spectrum balancing algorithms to solve a power allocation problem, which differs from the approach presented here. The proposed method, called diverseSB, uses a hybrid process composed of a multi-objective evolutionary algorithm (MOEA), more precisely the Non-Dominated Sorting Genetic Algorithm II (NSGA-II), together with a spectrum balancing algorithm. Simulation results show that, for a given diversity, the computational cost of determining diverse operating points using the proposed diverseSB algorithm is much lower than that of "brute-force" search methods. In the proposed method, NSGA-II makes calls to the adopted spectrum balancing algorithm; therefore, several tests involving the same number of calls to that algorithm were performed with the proposed diverseSB method and the brute-force search method, and the results obtained by diverseSB were far superior to those of the brute-force search. For example, the brute-force method, making 1600 calls to the spectrum balancing algorithm, obtains a set of operating points with diversity similar to that obtained by the proposed diverseSB method with 535 calls.
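A minimal sketch of the non-domination test at the core of NSGA-II, the MOEA used by diverseSB; the objectives (two users' data rates) and the points below are hypothetical, and the spectrum balancing calls that evaluate real operating points are not reproduced.

```python
# Hedged sketch of Pareto non-domination, the selection criterion at
# the heart of NSGA-II. Objectives are maximized (e.g., the data rates
# of two DSL users); all operating points below are hypothetical.
def dominates(a, b):
    """True if a is at least as good as b in every objective and
    strictly better in at least one (maximization convention)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated operating points."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (rate_user1, rate_user2) operating points in Mbit/s.
points = [(10, 2), (8, 5), (5, 8), (7, 4), (4, 4)]
print(pareto_front(points))  # [(10, 2), (8, 5), (5, 8)]
```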