923 results for pacs: information technology applications


Relevance: 40.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 40.00%

Abstract:

This paper examines issues in distributed systems development based on the Java RMI, EJB and J2EE technologies and tools. A comparative analysis is presented that identifies the domains in which each of these information technologies is the appropriate choice for the requirements of concrete distributed applications.

Relevance: 40.00%

Abstract:

This study was commissioned by one of Hungary's largest employers with the goal of working out BI solutions to make company operations more efficient. In this framework the authors first surveyed the current state of HR data-mining research worldwide: which tools are available for predicting and monitoring employee career progression, and how results from social network analysis can be used in the field of enterprise security. When real business problems and resources meet the mainstream research of the scientific community, it is always fortunate and rather fruitful. The authors are certain that the statements, conclusions and results published in this document will be beneficial for Foxconn and for other companies in the near future. Of course, the work is not finished: new research perspectives are continually opening up, and a huge amount of information is accumulating in enterprises, waiting to be discovered and analysed. The environment in which an enterprise operates also changes dynamically, so companies face new challenges and new types of business problems arise.
The authors hope that their research experience will help decision makers solve real-world business problems in the future as well.

Relevance: 30.00%

Abstract:

In this work we study an agent-based model to investigate the role of the degree of asymmetric information in market evolution. The model is quite simple and may be treated analytically, since consumers evaluate the quality of a given good taking into account only the quality of the last good purchased plus their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology: it incorporates all the technological capacity of the production system, such as education, scientific development and techniques that change productivity rates. The technological level plays an important role in explaining how the asymmetry of information may affect market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of asymmetric information before the market collapses. Above this critical point the market evolves for a limited period of time and then dies out completely. When beta is closer to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation. (C) 2011 Elsevier B.V. All rights reserved.
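The quality-evaluation chain described above can be sketched as a toy simulation. Everything in the sketch below — the perception rule, the willingness-to-pay threshold, the q**alpha pricing — is an illustrative assumption, not the authors' exact specification:

```python
import random

def simulate_market(alpha, beta, steps=1000, seed=0):
    """Toy version of the quality-evaluation Markov chain.

    Each round the consumer perceives the quality of the last good bought
    as a beta-weighted mix of its true quality and noise (beta = 1 means
    fully symmetric information), and buys only if perceived quality
    justifies the price q**alpha. Returns the fraction of rounds that end
    in a sale; all modelling choices here are assumptions for illustration.
    """
    rng = random.Random(seed)
    purchases = 0
    last_quality = rng.random()           # quality of the last good bought
    for _ in range(steps):
        q = rng.random()                  # quality offered this round
        perceived = beta * last_quality + (1 - beta) * rng.random()
        price = q ** alpha                # value grows with quality via alpha
        if perceived * q >= 0.5 * price:  # willingness-to-pay rule (assumed)
            purchases += 1
            last_quality = q
    return purchases / steps

print(simulate_market(alpha=2.0, beta=0.9))
```

With the seed fixed the run is deterministic, so different (alpha, beta) pairs can be compared directly.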

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to examine whether the level of logistics information systems (LIS) adoption in manufacturing companies is influenced by organizational profile variables, such as the company's size, the nature of its operations and their subsectors. Design/methodology/approach - A review of the mainstream literature on LIS was carried out to identify the factors influencing the adoption of such information systems and also some research gaps. The empirical study's strategy is based on survey research in Brazilian manufacturing firms from the capital goods industry. The data collected were analyzed through Kruskal-Wallis and Mann-Whitney non-parametric tests. Findings - The analysis indicates that characteristics such as the size of companies and the nature of their operations influence the levels of LIS adoption, whilst comparisons regarding the subsectors appeared to show little influence. Originality/value - This is the first known study to examine the influence of organizational profiles such as size, nature of operations and subsector on the level of LIS adoption in manufacturing companies. Moreover, it is unique in portraying the Brazilian scenario on this topic and addressing the adoption of seven types of LIS in a single study.
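The Kruskal-Wallis test named above is straightforward to compute from ranks. A minimal sketch on invented adoption scores (the numbers are hypothetical, not the paper's data):

```python
def ranks(values):
    """1-based midranks: tied values share the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1             # average rank of the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic (no tie correction): compares whether
    several groups share the same distribution using pooled ranks."""
    pooled = [x for g in groups for x in g]
    r = ranks(pooled)
    n = len(pooled)
    h, start = 0.0, 0
    for g in groups:
        rsum = sum(r[start:start + len(g)])
        h += rsum ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical LIS-adoption levels grouped by firm size.
small, medium, large = [2, 3, 3, 4, 2], [4, 5, 4, 6, 5], [6, 7, 8, 6, 7]
print(round(kruskal_wallis_h([small, medium, large]), 2))   # → 11.52
```

An H of 11.52 exceeds the chi-squared critical value for 2 degrees of freedom at the 5% level (5.99), so these invented size groups would differ significantly.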

Relevance: 30.00%

Abstract:

Extracting human postural information from video sequences has proved a difficult research question. The most successful approaches to date have been based on particle filtering, whereby the underlying probability distribution is approximated by a set of particles. The shape of the underlying observational probability distribution plays a significant role in determining the success, in both accuracy and efficiency, of any visual tracker. In this paper we compare the approaches used by other authors and present a cost-path approach that is commonly used in image segmentation problems but is currently not widely used in tracking applications.
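The predict-weight-resample cycle of particle filtering can be made concrete with a one-dimensional sketch. A real postural tracker would track a high-dimensional body configuration and a learned observation model; the Gaussian likelihood and scalar state below are illustrative assumptions only:

```python
import math
import random

def particle_filter_step(particles, observation, rng,
                         process_noise=0.1, obs_noise=0.2):
    """One predict-weight-resample cycle for a scalar state
    (think of it as a single joint angle)."""
    # Predict: diffuse each particle with process noise.
    particles = [p + rng.gauss(0.0, process_noise) for p in particles]
    # Weight: Gaussian observation likelihood around the measurement.
    weights = [math.exp(-((p - observation) ** 2) / (2 * obs_noise ** 2))
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw particles in proportion to their weights.
    return rng.choices(particles, weights=weights, k=len(particles))

rng = random.Random(42)
parts = [rng.uniform(-1, 1) for _ in range(500)]   # diffuse prior
for obs in [0.5, 0.55, 0.6]:                       # noisy observations
    parts = particle_filter_step(parts, obs, rng)
estimate = sum(parts) / len(parts)                 # posterior mean
print(round(estimate, 2))
```

After a few observations near 0.5-0.6 the particle cloud concentrates there, which is exactly the approximation of the posterior that the abstract refers to.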

Relevance: 30.00%

Abstract:

Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information in large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured. Data can be text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. So far, a number of classification algorithms have been put into practice. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include associative classification (Liu et al., 1998) and ensemble classification (Tumer, 1996).
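As a concrete instance of the example-based methods mentioned above, a k-nearest-neighbours classifier in the spirit of Duda & Hart (1973) fits in a few lines. The training data are invented for illustration:

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (feature_vector, label) pairs; squared Euclidean
    distance is used, which gives the same ordering as Euclidean distance.
    """
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented 2-D points with two classes.
train = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"), ((0.9, 1.1), "low"),
         ((4.0, 4.2), "high"), ((4.1, 3.9), "high"), ((3.8, 4.0), "high")]
print(knn_classify(train, (4.0, 4.1)))   # → high
```

The same vote-among-neighbours idea underlies the supervised, example-based family the abstract contrasts with tree-based, probabilistic and on-line methods.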

Relevance: 30.00%

Abstract:

This paper presents the unique collection of additional features of Qu-Prolog, a variant of the AI programming language Prolog, and illustrates how they can be used for implementing DAI applications. By this we mean applications comprising communicating information servers, expert systems, or agents, with sophisticated reasoning capabilities and internal concurrency. Such an application exploits the key features of Qu-Prolog: support for the programming of sound non-clausal inference systems, multi-threading, and high-level inter-thread message communication between Qu-Prolog query threads anywhere on the internet. The inter-thread communication uses email-style symbolic names for threads, allowing easy construction of distributed applications using public names for threads. How threads react to received messages is specified by a disjunction of reaction rules which the thread periodically executes. A communications API allows smooth integration of components written in C, which, to Qu-Prolog, look like remote query threads.
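The reaction-rule pattern can be imitated outside Prolog. The following Python analogy — guard/action pairs standing in for Qu-Prolog's disjunction of reaction rules, and a thread-safe queue standing in for its symbolic inter-thread messaging — is a loose sketch of the pattern, not the Qu-Prolog API:

```python
import queue
import threading

def reactive_thread(inbox, results):
    """Message loop: each received message is tried against an ordered list
    of (guard, action) rules, mimicking a disjunction of reaction rules."""
    rules = [
        (lambda m: m[0] == "ping", lambda m: results.append("pong")),
        (lambda m: m[0] == "add",  lambda m: results.append(m[1] + m[2])),
        (lambda m: m[0] == "stop", lambda m: results.append("done")),
    ]
    while True:
        msg = inbox.get()                 # block until a message arrives
        for guard, action in rules:
            if guard(msg):
                action(msg)
                break
        if msg[0] == "stop":
            return

inbox, results = queue.Queue(), []
t = threading.Thread(target=reactive_thread, args=(inbox, results))
t.start()
inbox.put(("ping",))
inbox.put(("add", 2, 3))
inbox.put(("stop",))
t.join()
print(results)   # → ['pong', 5, 'done']
```

In Qu-Prolog the rules would be Prolog disjuncts and the queue would be addressed by an email-style public thread name rather than a local object.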

Relevance: 30.00%

Abstract:

Some patients are no longer able to communicate effectively or even interact with the outside world in ways that most of us take for granted. In the most severe cases, tetraplegic or post-stroke patients are literally 'locked in' their bodies, unable to exert any motor control after, for example, a spinal cord injury or a brainstem stroke, and require alternative methods of communication and control. But we suggest that, in the near future, their brains may offer them a way out. Non-invasive electroencephalogram (EEG)-based brain-computer interfaces (BCIs) can be characterized by the technique used to measure brain activity and by the way that different brain signals are translated into commands that control an effector (e.g., controlling a computer cursor for word processing and accessing the internet). This review focuses on the basic concepts of EEG-based BCI, the main advances in communication, motor control restoration and the down-regulation of cortical activity, and the mirror neuron system (MNS) in the context of BCI. The latter appears to be relevant for clinical applications in the coming years, particularly for severely limited patients. Hypothetically, the MNS could provide a robust way to map neural activity to behavior, representing high-level information about the goals and intentions of these patients. Non-invasive EEG-based BCIs allow brain-derived communication in patients with amyotrophic lateral sclerosis and motor control restoration in patients after spinal cord injury and stroke. Epilepsy and attention-deficit/hyperactivity disorder patients were able to down-regulate their cortical activity. Given the rapid progression of EEG-based BCI research over the last few years and the swift ascent of computer processing speeds and signal analysis techniques, we suggest that emerging ideas (e.g., the MNS in the context of BCI) related to the clinical neuro-rehabilitation of severely limited patients will generate viable clinical applications in the near future.
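The signal-to-command translation at the heart of an EEG-based BCI can be sketched with a single spectral feature. The channel roles, the 10 Hz mu band and the decision rule below are illustrative assumptions for a motor-imagery-style decoder, not a clinical system:

```python
import math

def band_power(signal, freq, fs):
    """Power of `signal` at `freq` Hz via a single-bin Fourier projection
    (a minimal stand-in for a proper spectral estimate)."""
    re = sum(x * math.cos(2 * math.pi * freq * i / fs)
             for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs)
             for i, x in enumerate(signal))
    return (re ** 2 + im ** 2) / len(signal)

def decode_command(left_ch, right_ch, fs=128, mu=10.0):
    """Map lateralized mu-rhythm (~10 Hz) power to a cursor command.

    Motor imagery suppresses mu power over the contralateral hemisphere,
    so weaker left-channel mu suggests right-hand imagery and vice versa.
    """
    pl, pr = band_power(left_ch, mu, fs), band_power(right_ch, mu, fs)
    return "move_right" if pl < pr else "move_left"

fs = 128
t = [i / fs for i in range(fs)]                             # 1 s of samples
quiet = [0.1 * math.sin(2 * math.pi * 10 * x) for x in t]   # suppressed mu
strong = [1.0 * math.sin(2 * math.pi * 10 * x) for x in t]  # intact mu
print(decode_command(quiet, strong))   # → move_right
```

Real decoders replace the single-bin feature with calibrated spectral features and a trained classifier, but the measure-translate-command pipeline is the same one the review describes.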

Relevance: 30.00%

Abstract:

The field of protein crystallography inspires and enthrals, whether it be for the beauty and symmetry of a perfectly formed protein crystal, the unlocked secrets of a novel protein fold, or the precise atomic-level detail yielded from a protein-ligand complex. Since 1958, when the first protein structure was solved, there have been tremendous advances in all aspects of protein crystallography, from protein preparation and crystallisation through to diffraction data measurement and structure refinement. These advances have significantly reduced the time required to solve protein crystal structures, while at the same time substantially improving the quality and resolution of the resulting structures. Moreover, the technological developments have induced researchers to tackle ever more complex systems, including ribosomes and intact membrane-bound proteins, with a reasonable expectation of success. In this review, the steps involved in determining a protein crystal structure are described and the impact of recent methodological advances identified. Protein crystal structures have proved to be extraordinarily useful in medicinal chemistry research, particularly with respect to inhibitor design. The precise interaction between a drug and its receptor can be visualised at the molecular level using protein crystal structures, and this information then used to improve the complementarity and thus increase the potency and selectivity of an inhibitor. The use of protein crystal structures in receptor-based drug design is highlighted by (i) HIV protease, (ii) influenza virus neuraminidase and (iii) prostaglandin H2-synthase.
These represent, respectively, examples of protein crystal structures that (i) influenced the design of drugs currently approved for use in the treatment of HIV infection, (ii) led to the design of compounds currently in clinical trials for the treatment of influenza infection and (iii) could enable the design of highly specific non-steroidal anti-inflammatory drugs that lack the common side-effects of this drug class.

Relevance: 30.00%

Abstract:

Background-Epicardial coronary injury is by far the most feared complication of epicardial ablation. Little information is available regarding the chronic effects of delivering radiofrequency in the vicinity of large coronary vessels, and the long-term impact of this approach for mapping and ablation on epicardial vessel integrity is poorly understood. Therefore, the aim of this study was to characterize the acute and chronic histopathologic changes produced by in vivo epicardial pulses of radiofrequency ablation on the coronary arteries of porcine hearts. Methods and Results-Seven pigs underwent a left thoracotomy. The catheter was sutured adjacent to the left anterior descending artery and left circumflex artery, and 20 pulses of radiofrequency energy were applied. Radiofrequency lesions located no more than 1 mm from the vessel were used for this analysis. Three animals were euthanized 20 days (acute phase) after the procedure and 4 animals after 70 days (chronic phase). The following parameters were obtained for each vessel analyzed: (1) internal and external perimeter; (2) vessel wall thickness; (3) tunica media thickness; and (4) tunica intima thickness. The presence of adipose tissue around the coronary arteries, the distance between the artery and the epicardium, and the anatomic relationship of the artery with the coronary vein were also documented for each section. Sixteen of 20 (80%) sections analyzed showed intimal thickening, with a mean of 0.18 +/- 0.14 mm in the chronic phase compared with 0.13 +/- 0.16 mm in the acute phase (P=0.331). The mean tunica media thickness was 0.25 +/- 0.10 mm in the chronic-phase animals compared with 0.18 +/- 0.03 mm in the acute-phase animals (P=0.021). A clear protective effect of pericardial fat and coronary veins was also present. A positive correlation between the depth of the radiofrequency lesion and the degree of vessel injury, expressed as intimal and media thickening (P=0.001), was present.
A negative correlation was identified (r = -0.83; P=0.002) between intimal thickening and the distance between the epicardium and the coronary artery. Conclusions-In this porcine model, in vivo epicardial radiofrequency ablation in proximity to coronary arteries leads to acute and chronic histopathologic changes characterized by tunica intima and media thickening, with replacement of smooth muscle cells by extracellular matrix, but no significant stenosis was observed up to 70 days after the ablation. The absence of acute coronary occlusion or injury does not preclude subsequent significant arterial damage, which frequently occurs when epicardial radiofrequency applications are delivered in close vicinity to the vessels. (Circ Arrhythm Electrophysiol. 2011;4:526-531.)
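The correlations reported above (e.g. r = -0.83 between intimal thickening and epicardium-artery distance) use the standard Pearson coefficient, sketched here on invented measurements, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented illustrative values (mm): thicker intima where the artery lies
# closer to the epicardium yields a strongly negative r, matching the
# direction (though not the magnitude) of the reported finding.
distance   = [0.2, 0.5, 0.8, 1.1, 1.5, 2.0]
thickening = [0.40, 0.33, 0.28, 0.20, 0.15, 0.10]
print(round(pearson_r(distance, thickening), 2))
```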

Relevance: 30.00%

Abstract:

In population pharmacokinetic studies, the precision of parameter estimates depends on the population design. Methods based on the Fisher information matrix have been developed and extended to population studies to evaluate and optimize designs. In this paper we propose simple programming tools to evaluate population pharmacokinetic designs. This involved the development of an expression for the Fisher information matrix for nonlinear mixed-effects models, including estimation of the variance of the residual error. We implemented this expression as a generic function for two software applications: S-PLUS and MATLAB. The evaluation of population designs based on two pharmacokinetic examples from the literature is shown to illustrate the efficiency and simplicity of this theoretical approach. Although no method for optimizing the design is provided, these functions can be used to select and compare population designs among a large set of candidate designs, avoiding extensive simulation.
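The idea of ranking sampling designs by Fisher information can be shown on a deliberately simple one-parameter model; the population (mixed-effects) expression in the paper adds random-effect and residual-variance terms that this single-subject sketch omits:

```python
import math

def fisher_information(times, ke, sigma):
    """Fisher information for the elimination rate ke of the one-parameter
    model y(t) = exp(-ke * t) with additive Gaussian error of SD sigma:
    I(ke) = sum_i (dy_i/dke)^2 / sigma^2."""
    sens = [-t * math.exp(-ke * t) for t in times]   # dy/dke at each time
    return sum(s * s for s in sens) / sigma ** 2

# Compare two candidate sampling designs: higher information means a
# smaller expected standard error (SE ~ 1/sqrt(I)) for ke, with no
# simulation required.
design_a = [0.5, 1.0, 2.0, 4.0]     # samples during the decay
design_b = [8.0, 9.0, 10.0, 11.0]   # samples long after the drug is gone
ke, sigma = 1.0, 0.1
print(fisher_information(design_a, ke, sigma) >
      fisher_information(design_b, ke, sigma))   # → True
```

This is the same select-and-compare use the abstract describes: evaluate the information of each candidate design analytically and keep the most informative one.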

Relevance: 30.00%

Abstract:

When the data consist of certain attributes measured on the same set of items in different situations, they can be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in the different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays where some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.
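The mixture-likelihood idea can be sketched with plain EM on a one-dimensional, two-component Gaussian mixture; the three-mode structure and the missing-at-random machinery of the paper are omitted, so this is only the core clustering step:

```python
import math
import random

def em_gaussian_mixture(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture with a shared variance.
    Returns the two estimated component means."""
    mu = [min(data), max(data)]          # crude initial means
    pi, var = [0.5, 0.5], 1.0
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            w = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var))
                 for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate weights, means and the shared variance.
        n = [sum(r[k] for r in resp) for k in range(2)]
        pi = [n[k] / len(data) for k in range(2)]
        mu = [sum(r[k] * x for r, x in zip(resp, data)) / n[k]
              for k in range(2)]
        var = sum(r[k] * (x - mu[k]) ** 2
                  for r, x in zip(resp, data) for k in range(2)) / len(data)
    return mu

rng = random.Random(1)
data = ([rng.gauss(0, 0.5) for _ in range(100)] +
        [rng.gauss(5, 0.5) for _ in range(100)])
means = sorted(em_gaussian_mixture(data))
print([round(m, 1) for m in means])
```

The paper's extension replaces each scalar point with an item's attribute-by-situation slice and, for missing entries, takes the E-step expectation over the observed values only.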