918 results for Sequential auctions


Relevance: 10.00%

Publisher:

Abstract:

Objective: To perform spectral analysis of the noise generated by equipment and activities in a level III neonatal intensive care unit (NICU) and to measure real-time sequential hourly noise levels over a 15-day period. Methods: Noise generated in the NICU by individual pieces of equipment and by activities was recorded with a digital spectral sound analyzer for spectral analysis over 0.5–8 kHz. Sequential hourly noise level measurements in all rooms of the NICU were taken for 15 days using a digital sound pressure level meter. Independent-sample t tests and one-way ANOVA were used to examine the statistical significance of the results. The study had 90% power to detect differences of at least 4 dB from the recommended maximum of 50 dB with 95% confidence. Results: The mean noise levels in the ventilator room and the stable room were 19.99 dB(A) sound pressure level (SPL) and 11.81 dB(A) SPL higher than the recommended maximum of 50 dB(A), respectively (p < 0.001). The equipment generated noise 19.11 dB SPL above the recommended norms in the 1–8 kHz spectrum, and the activities generated noise 21.49 dB SPL above the recommended norms in the same spectrum (p < 0.001). The ventilator and nebulisers produced excess noise of 8.5 dB SPL at 0.5 kHz. Conclusion: Noise levels in the NICU are unacceptably high. Spectral analysis of equipment and activity noise showed noise predominantly in the 1–8 kHz spectrum. These levels warrant immediate implementation of noise reduction protocols as a standard of care in the NICU.
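As a rough illustration of the comparisons this abstract describes (hourly room levels tested against the 50 dB(A) recommended maximum, rooms compared with an independent-sample t test and one-way ANOVA), a minimal Python sketch follows; the readings and room names are invented for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical hourly noise readings, dB(A) SPL, for two NICU rooms.
ventilator_room = np.array([68.5, 71.2, 69.8, 70.4, 72.1, 69.3, 70.9, 71.6])
stable_room = np.array([61.0, 62.4, 63.1, 60.8, 62.0, 61.7, 63.5, 62.2])

RECOMMENDED_MAX = 50.0  # dB(A)

# One-sample t test of each room's mean level against the recommended maximum.
t_vent, p_vent = stats.ttest_1samp(ventilator_room, RECOMMENDED_MAX)
t_stab, p_stab = stats.ttest_1samp(stable_room, RECOMMENDED_MAX)

# Independent-sample t test comparing the two rooms.
t_ind, p_ind = stats.ttest_ind(ventilator_room, stable_room)

# One-way ANOVA generalizes the comparison to more than two rooms.
f_stat, p_anova = stats.f_oneway(ventilator_room, stable_room)

print(p_vent, p_stab, p_ind, p_anova)
```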

Relevance: 10.00%

Publisher:

Abstract:

This paper gives a new iterative algorithm for kernel logistic regression. It is based on the solution of a dual problem using ideas similar to those of the Sequential Minimal Optimization algorithm for Support Vector Machines. Asymptotic convergence of the algorithm is proved. Computational experiments show that the algorithm is robust and fast. The algorithmic ideas can also be used to give a fast dual algorithm for solving the optimization problem arising in the inner loop of Gaussian Process classifiers.
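For orientation, the sketch below fits a kernel logistic regression model by plain gradient descent on the representer-theorem expansion; it is not the paper's SMO-style dual algorithm, and the RBF kernel, regularization constant, step size, and toy data are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_logistic_regression(X, y, lam=0.1, gamma=1.0, iters=1000, lr=0.05):
    """Fit f(x) = sum_i alpha_i k(x_i, x) by gradient descent on the
    regularized logistic loss (labels y in {-1, +1})."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(iters):
        f = K @ alpha
        p = 1.0 / (1.0 + np.exp(y * f))          # sigmoid(-y * f)
        grad = K @ (-y * p) / n + lam * (K @ alpha)
        alpha -= lr * grad
    return alpha

# Toy usage on synthetic 2-D data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=40))
alpha = kernel_logistic_regression(X, y)
pred = np.sign(rbf_kernel(X, X) @ alpha)
print("training accuracy:", (pred == y).mean())
```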

Relevance: 10.00%

Publisher:

Abstract:

Action, Power and Experience in Organizational Change - A Study of Three Major Corporations. This study explores change management and resistance to change as social activities and power displays through worker experiences in three major Finnish corporations. Two important sensitizing concepts were applied. Firstly, Richard Sennett's perspective on work in the new form of capitalism and its shortcomings (the lack of commitment and freedom, accompanied by the disruption of lifelong career planning and the feeling of job insecurity) offered a fruitful starting point for a critical study. Secondly, Michel Foucault's classical concept of power, treated as anecdotal, interactive and non-measurable, provided tools for analyzing change-enabling and resisting acts. The study bridges the gap between management and social sciences. The former have usually concentrated on leadership issues, best practices and goal attainment, while the latter have covered worker experiences, power relations and political conflicts. The study was motivated by three research questions. Firstly, why do people resist or support changes in their work, work environment or organization, and what kinds of analyses are these behavioural choices based on? Secondly, what practical forms do support for and resistance to change take, and how do people choose between the different ways of acting? Thirdly, how do the people involved experience and describe their own subject position and actions in changing environments? The examination focuses on practical interpretations and action descriptions given by members of three major Finnish business organizations. The empirical data was collected during a two-year period in the Finnish Post Corporation, the Finnish branch of the Vattenfall Group (one of the leading European energy companies), and the Mehiläinen Group, the leading private medical service provider in Finland. It includes 154 unstructured thematic interviews and 309 biographies concentrating on personal experiences of change. All positions and organizational levels were represented. The analysis was conducted using the grounded theory method introduced by Strauss and Corbin, in three sequential phases comprising open, axial and selective coding. The result is a hierarchical structure of categories, summarized in a process model of change behaviour patterns. Key ingredients are past experiences and future expectations, which lead to different change relations and behavioural roles. Ultimately, they contribute to strategic and tactical choices realized as both public and hidden forms of action. The same forms of action can be used both in supporting and in resisting change, and there are no specific dividing lines either between employer and employee roles or between different hierarchical positions. In general, however, it is possible to conclude that strategic choices lead more often to public forms of action, whereas tactical choices result in hidden forms. The primary goal of the study was to provide knowledge with practical applications in everyday business life, HR and change management. The results, therefore, are highly applicable to other organizations as well as to less change-dominated situations, whenever power relations and conflicting interests are present. A sociological thesis on classical business management issues can be of considerable value in revealing the crucial social processes behind behavioural patterns.
Keywords: change management, organizational development, organizational resistance, resistance to change, labor relations, organization, leadership

Relevance: 10.00%

Publisher:

Abstract:

The goals of this article are to integrate action regulation theory (ART) with the lifespan developmental perspective and to outline tenets of a new metatheory of work and aging. The action regulation across the adult lifespan (ARAL) theory explains how workers influence, and are influenced by, their environment across different time spans. First, the basic concepts of ART are described, including the sequential and hierarchical structure of actions, complete tasks and actions, foci of action regulation, and the action-regulating mental model. Second, principles of the lifespan developmental perspective are delineated, including development as a lifelong and multidirectional process, the joint occurrence of gains and losses, intraindividual plasticity, historical embeddedness, and contextualism. Third, propositions of ARAL theory are derived by analyzing workers’ action regulation from a lifespan developmental perspective (i.e., effects of aging on action regulation), and by analyzing aging and development in the work context from an ART perspective (i.e., effects of action regulation on age-related changes in cognition and personality). Fourth, we develop further propositions to integrate ART with lifespan theories of motivation and socioemotional experience. Finally, we discuss implications for future research and practice based on ARAL theory.

Relevance: 10.00%

Publisher:

Abstract:

Assessment of heavy metal bioavailability in sediments is complex because of the number of partial extraction methods available for the assessment and the general lack of certified reference materials. This study evaluates five different extraction methodologies to ascertain the relative strengths and weaknesses of each method. The results are then compared with previously published work to identify the most effective partial extraction technique, which was established to be extraction with dilute (0.75–1 M) nitric acid solutions. These results imply that single-reagent, weak acid extractions provide a better assessment of potentially bioavailable metals than the chelating agents used in sequential extraction methods.

Relevance: 10.00%

Publisher:

Abstract:

The conformational properties of foldamers generated from alpha gamma hybrid peptide sequences have been probed in the model sequence Boc-Aib-Gpn-Aib-Gpn-NHMe. The choice of alpha-aminoisobutyryl (Aib) and gabapentin (Gpn) residues greatly restricts the sterically accessible conformational space. This model sequence was anticipated to form a short segment of the alpha gamma C-12 helix, stabilized by three successive 4 -> 1 hydrogen bonds, corresponding to a backbone-expanded analogue of the alpha polypeptide 3(10)-helix. Unexpectedly, three distinct crystalline polymorphs were characterized in the solid state by X-ray diffraction. In one form, two successive C-12 hydrogen bonds were obtained at the N-terminus, while a novel C-17 hydrogen-bonded gamma alpha gamma turn was observed at the C-terminus. In the other two polymorphs, isolated C-9 and C-7 hydrogen-bonded turns were observed at Gpn(2) and Gpn(4). Isolated C-12 and C-9 turns were also crystallographically established in the peptides Boc-Aib-Gpn-Aib-OMe and Boc-Gpn-Aib-NHMe, respectively. Selective line broadening of NH resonances and the observation of medium-range NH(i) <-> NH(i+2) NOEs established the presence of conformational heterogeneity for the tetrapeptide in CDCl3 solution. The NMR results are consistent with a limited population of the continuous C-12 helix conformation. Lengthening of the (alpha gamma)(n) sequence in the nonapeptides Boc-Aib-Gpn-Aib-Gpn-Aib-Gpn-Aib-Gpn-Xxx (Xxx = Aib, Leu) resulted in the observation of all of the sequential NOEs characteristic of an alpha gamma C-12 helix. These results establish that conformational fragility is manifested in short hybrid alpha gamma sequences despite the choice of conformationally constrained residues, while stable helices are formed on chain extension.

Relevance: 10.00%

Publisher:

Abstract:

The dissertation examines how emotional experiences are oriented to in the details of psychotherapeutic interaction. The data (57 audio-recorded sessions) come from one therapist-patient dyad in cognitive psychotherapy. Conversation analysis is used as the method. The dissertation consists of four original articles and a summary. The analyses explicate the therapist's practices of responding to the patient's affective expressions. Different types of affiliating responses are identified. It is shown that the affiliating responses are combined with, or build grounds for, more interpretive and challenging actions. The study also includes a case study of a session with strong misalignment between the therapist's and patient's orientations, showing how this misalignment is managed by the therapist. Moreover, through a longitudinal analysis of the transformation of a sequence type, the study suggests that therapeutic change processes can be located in sequential relations of actions. The practices found in this study are compared to earlier research on everyday talk and on medical encounters. It is suggested that in psychotherapeutic interaction, the generic norms of interaction concerning affiliation and epistemic access are modified for the purposes of therapeutic work. The study also shows that the practices of responding to emotional experience in psychotherapy can deviate from the everyday practices of affiliation. The results of the study are also discussed in terms of concepts arising from clinical theory. These include empathy, validation of emotion, therapeutic alliance, interpretation, challenging beliefs, and therapeutic change. The therapist's approach described in this study involves practical integration of different clinical theories. In general terms, the study suggests that in the details of interaction, psychotherapy recurrently performs a dual task of empathy and challenging in relation to the patient's ways of describing their experiences. Methodologically, the study discusses the problem of identifying actions in conversation analysis of psychotherapy and emotional interaction, and the possibility of applying conversation analysis to the study of therapeutic change.

Relevance: 10.00%

Publisher:

Abstract:

Hall thrusters, such as the Stationary Plasma Thruster (SPT), have been widely used on board modern satellites placed in geosynchronous orbits for purposes such as orbit maintenance, repositioning and attitude control. In order to study the performance of the stationary plasma thruster, the thrust produced by it has been measured under vacuum conditions, using a thrust balance with strain gauge sensors, by activating the thruster. This activation has been carried out by switching the necessary power supplies on and off and by controlling other parts of the feed system, such as the propellant flow, in a particular sequence. Hitherto, these operations were done manually in the required sequence. This paper reports an attempt to automate the sequential operation of the power supplies and the necessary control valves of the feed system using an Intel 8051 microcontroller. This automation has made thrust measurements easier and more sophisticated.
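The abstract does not give the actual switching order or the 8051 firmware; purely to illustrate the idea of an automated on/off sequence with settle times, here is a high-level Python sketch in which the step names, order and delays are invented.

```python
import time

# Hypothetical activation sequence; names, order and settle times are
# illustrative only and do not reflect the actual SPT firing procedure.
ACTIVATION_SEQUENCE = [
    ("open_propellant_valve", 2.0),
    ("enable_cathode_heater", 5.0),
    ("enable_keeper_supply", 1.0),
    ("enable_discharge_supply", 1.0),
    ("enable_magnet_supply", 0.5),
]

def run_sequence(steps, actuate):
    """Execute each switching step in order, waiting the given settle
    time (seconds) before the next step; `actuate` performs the action."""
    for name, settle in steps:
        actuate(name, on=True)
        time.sleep(settle)

def shutdown(steps, actuate):
    # Switch everything off in reverse order.
    for name, _ in reversed(steps):
        actuate(name, on=False)

if __name__ == "__main__":
    log = lambda name, on: print(f"{'ON ' if on else 'OFF'} {name}")
    run_sequence(ACTIVATION_SEQUENCE, log)
    shutdown(ACTIVATION_SEQUENCE, log)
```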

Relevance: 10.00%

Publisher:

Abstract:

A hybrid computer for structure factor calculations in X-ray crystallography is described. The computer can calculate three-dimensional structure factors of up to 24 atoms in a single run and can generate the scattering functions of well over 100 atoms using the Vand et al. or the Forsyth and Wells approximations. The computer is essentially a digital computer with analog function generators, thus combining to advantage the economical data storage of digital systems and the simple computing circuitry of analog systems. The digital part serially selects the data, computes and feeds the arguments into specially developed high-precision digital-analog function generators; their outputs, being d.c. voltages, are further processed by analog circuits, and finally the sequential adder, which employs a novel digital voltmeter circuit, converts them back into digital form and accumulates them in a dekatron counter that displays the final result. The computer is also capable of carrying out 1-, 2- or 3-dimensional Fourier summation, although in this case the lack of sufficient storage space for the large number of coefficients involved is a serious limitation at present.
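For reference, the quantity such a machine evaluates is the structure factor F(hkl) = sum_j f_j exp(2 pi i (h x_j + k y_j + l z_j)) over the atoms of the unit cell. A minimal digital sketch follows; the two-atom cell, fractional coordinates, and constant scattering factors (used in place of the Vand et al. or Forsyth and Wells approximations) are illustrative assumptions.

```python
import numpy as np

def structure_factor(hkl, coords, f):
    """F(hkl) = sum_j f_j * exp(2*pi*i*(h*x_j + k*y_j + l*z_j)).
    coords: (N, 3) fractional coordinates; f: (N,) scattering factors,
    taken as constants here instead of angle-dependent approximations."""
    phase = 2.0 * np.pi * coords @ np.asarray(hkl, dtype=float)
    return np.sum(f * np.exp(1j * phase))

# Illustrative two-atom cell with invented coordinates and scattering factors.
coords = np.array([[0.00, 0.00, 0.00],
                   [0.25, 0.25, 0.25]])
f = np.array([6.0, 8.0])

F = structure_factor((1, 1, 1), coords, f)
print(abs(F), np.angle(F, deg=True))   # |F| and phase in degrees
```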

Relevance: 10.00%

Publisher:

Abstract:

We combine results from searches by the CDF and D0 collaborations for a standard model Higgs boson (H) in the process gg->H->W+W- in p-pbar collisions at the Fermilab Tevatron Collider at sqrt(s) = 1.96 TeV. With 4.8 fb-1 of integrated luminosity analyzed at CDF and 5.4 fb-1 at D0, the 95% confidence level upper limit on sigma(gg->H) x B(H->W+W-) is 1.75 pb at m_H = 120 GeV, 0.38 pb at m_H = 165 GeV, and 0.83 pb at m_H = 200 GeV. Assuming the presence of a fourth sequential generation of fermions with large masses, we exclude at the 95% confidence level a standard-model-like Higgs boson with a mass between 131 and 204 GeV.

Relevance: 10.00%

Publisher:

Abstract:

Market microstructure is “the study of the trading mechanisms used for financial securities” (Hasbrouck (2007)). It seeks to understand the sources of value and reasons for trade, in a setting with different types of traders, and different private and public information sets. The actual mechanisms of trade are a continually changing object of study. These include continuous markets, auctions, limit order books, dealer markets, or combinations of these operating as a hybrid market. Microstructure also has to allow for the possibility of multiple prices. At any given time an investor may be faced with a multitude of different prices, depending on whether he or she is buying or selling, the quantity he or she wishes to trade, and the required speed for the trade. The price may also depend on the relationship that the trader has with potential counterparties. In this research, I touch upon all of the above issues. I do this by studying three specific areas, all of which have both practical and policy implications. First, I study the role of information in trading and pricing securities in markets with a heterogeneous population of traders, some of whom are informed and some not, and who trade for different private or public reasons. Second, I study the price discovery of stocks in a setting where they are simultaneously traded in more than one market. Third, I make a contribution to the ongoing discussion about market design, i.e. the question of which trading systems and ways of organizing trading are most efficient. A common characteristic throughout my thesis is the use of high frequency datasets, i.e. tick data. These datasets include all trades and quotes in a given security, rather than just the daily closing prices, as in traditional asset pricing literature. This thesis consists of four separate essays. In the first essay I study price discovery for European companies cross-listed in the United States. I also study explanatory variables for differences in price discovery. In my second essay I contribute to earlier research on two issues of broad interest in market microstructure: market transparency and informed trading. I examine the effects of a change to an anonymous market at the OMX Helsinki Stock Exchange. I broaden my focus slightly in the third essay, to include releases of macroeconomic data in the United States. I analyze the effect of these releases on European cross-listed stocks. The fourth and last essay examines the uses of standard methodologies of price discovery analysis in a novel way. Specifically, I study price discovery within one market, between local and foreign traders.

Relevance: 10.00%

Publisher:

Abstract:

The negative relationship between economic growth and stock market returns is not an anomaly, according to evidence documented in many economies. It is argued that future economic growth is largely irrelevant for predicting future equity returns, since long-run equity returns depend mainly on dividend yields and the growth of per-share dividends. Economic growth does result in a higher standard of living for consumers, but does not necessarily translate into higher returns for owners of capital. The divergence in performance between the real sector and stock markets appears to support the above argument. However, this thesis strives to offer an alternative explanation for the apparent divergence within the framework of corporate governance. It argues that weak corporate governance standards in Chinese listed firms, exacerbated by poor investor protection, result in a marginalized capital market. Each of the three essays in the thesis addresses one particular aspect of corporate governance on the Chinese stock market in a sequential way by gathering empirical evidence on three distinct stock market activities. The first essay questions whether significant agency conflicts do exist by building a game on rights issues. It documents significant divergence in interests among shareholders holding different classes of shares. The second essay investigates the level of agency costs by examining the value of control through a sample of block transactions. It finds that block transactions that transfer ultimate control entail higher premiums. The third essay looks into possible avenues through which corporate governance standards could be improved by investigating the economic consequences of cross-listing on the Chinese stock market. It finds that, by adopting a higher disclosure standard through cross-listing, firms voluntarily commit themselves to reducing information asymmetry and consequently command higher valuations than their counterparts.

Relevance: 10.00%

Publisher:

Abstract:

A fast algorithm for the computation of maximum compatible classes (MCCs) among the internal states of an incompletely specified sequential machine is presented in this paper. All the maximum compatible classes are determined by processing compatibility matrices of progressively diminishing order, whose total number does not exceed (p + m), where p is the largest cardinality among these classes and m is the number of such classes. Consequently, the algorithm is especially suitable for the state minimization of very large sequential machines, as encountered in VLSI circuits and systems.
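The abstract does not spell out the matrix-processing procedure itself, so as a generic point of reference only: once the pairwise compatibility relation of an incompletely specified machine has been settled (for example via the implication table), the maximum compatible classes are the maximal cliques of the compatibility graph. A minimal Python sketch using Bron-Kerbosch enumeration over an invented compatibility relation:

```python
def maximal_compatibles(compat):
    """Enumerate maximum compatible classes as maximal cliques of the
    pairwise compatibility graph (Bron-Kerbosch enumeration).
    compat: dict mapping each state to the set of states compatible with it."""
    classes = []

    def expand(r, p, x):
        if not p and not x:
            classes.append(sorted(r))
            return
        for v in sorted(p):
            expand(r | {v}, p & compat[v], x & compat[v])
            p = p - {v}
            x = x | {v}

    expand(set(), set(compat), set())
    return classes

# Invented pairwise compatibility relation for a 5-state machine
# (in practice this comes from the implication-table analysis).
compat = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B", "E"},
    "E": {"D"},
}

print(maximal_compatibles(compat))   # [['A', 'B', 'C'], ['B', 'D'], ['D', 'E']]
```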

Relevance: 10.00%

Publisher:

Abstract:

In this paper, we present a growing-and-pruning radial basis function (GAP-RBF) based no-reference (NR) image quality model for JPEG-coded images. The quality of the images is estimated without reference to the original images. The features for predicting the perceived image quality are extracted by considering key human visual system (HVS) sensitivity factors such as edge amplitude, edge length, background activity and background luminance. Image quality estimation involves computing the functional relationship between HVS features and subjective test scores. Here, the problem of quality estimation is transformed into a function approximation problem and solved using a GAP-RBF network. The GAP-RBF network uses a sequential learning algorithm to approximate the functional relationship. The computational complexity and memory requirement are lower for the GAP-RBF algorithm than for other batch learning algorithms. Also, the GAP-RBF algorithm finds a compact image quality model and does not require retraining when new image samples are presented. Experimental results show that the GAP-RBF image quality model emulates the mean opinion score (MOS). The subjective test results of the proposed metric are compared with the JPEG no-reference image quality index as well as the full-reference structural similarity (SSIM) image quality index, and it is observed to outperform both.
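As a rough frame of reference only, the sketch below fits an RBF network mapping feature vectors to quality scores by a batch ridge least-squares solve; it stands in for, and is not, GAP-RBF's sequential growing-and-pruning updates, and the feature data, centre selection and kernel width are invented.

```python
import numpy as np

def rbf_design(X, centres, width):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centres, width, ridge=1e-6):
    """Batch ridge-regularized least-squares fit of the output weights;
    a stand-in for GAP-RBF's sequential growing/pruning learning."""
    Phi = rbf_design(X, centres, width)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict(X, centres, width, w):
    return rbf_design(X, centres, width) @ w

# Invented HVS-style feature vectors (e.g. edge amplitude, edge length,
# background activity, background luminance) and toy MOS-like targets.
rng = np.random.default_rng(1)
X = rng.uniform(size=(50, 4))
y = 5.0 - 3.0 * X[:, 0] + rng.normal(scale=0.1, size=50)

centres = X[rng.choice(len(X), size=8, replace=False)]   # simple centre choice
w = fit_rbf(X, y, centres, width=0.5)
print(predict(X[:3], centres, 0.5, w))
```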

Relevance: 10.00%

Publisher:

Abstract:

Separation of printed text blocks from non-text areas containing signatures, handwritten text, logos and other such symbols is a necessary first step for an OCR system targeting printed text recognition. In the present work, we compare the efficacy of some feature-classifier combinations in carrying out this separation task. We have selected the length-normalized horizontal projection profile (HPP) as the starting point of the separation task, on the assumption that printed text blocks contain lines of text which generate HPPs with some regularity. This assumption is demonstrated to be valid. Our features are the HPP and its two transformed versions, namely the eigen and Fisher profiles. Four well-known classifiers, namely nearest neighbor, linear discriminant function, SVMs and artificial neural networks, have been considered, and the efficiency of the combination of these classifiers with the above features is compared. A sequential floating feature selection technique has been adopted to enhance the efficiency of the separation task. The results give an average accuracy of about 96%.
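To make the starting feature concrete, here is a minimal sketch of a length-normalized horizontal projection profile for a binarized block; the toy block, the binarization convention (text pixels as 1), and the fixed profile length are assumptions made for the example.

```python
import numpy as np

def horizontal_projection_profile(block, length=64):
    """Row-wise sum of text pixels in a binary block (text = 1),
    resampled to a fixed length so blocks of different heights are
    comparable, then scaled to unit maximum."""
    hpp = block.sum(axis=1).astype(float)          # one value per row
    rows = np.arange(len(hpp))
    grid = np.linspace(0, len(hpp) - 1, length)
    hpp = np.interp(grid, rows, hpp)               # length normalization
    peak = hpp.max()
    return hpp / peak if peak > 0 else hpp

# Toy binary block: alternating "text line" and "gap" rows.
block = np.zeros((40, 120), dtype=int)
block[5:12, :] = 1
block[18:25, :] = 1
block[31:38, :] = 1

profile = horizontal_projection_profile(block)
print(profile.shape, profile.min(), profile.max())
```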