938 results for Computer Network


Relevance:

60.00%

Publisher:

Abstract:

This study analyzed women's opportunities for equality in reaching senior management positions in the Brazilian Chamber of Deputies (Câmara dos Deputados). The field research was carried out between November 2005 and January 2006, with a sample of 1,320 participants. The Scale of Perceived Equality of Opportunities between Women and Men was applied to assess staff attitudes toward the possibilities and limits of women's advancement within the organization's career structure. The scale comprised 34 Likert-type items plus 6 demographic items. Data were collected over the Chamber of Deputies' internal computer network, via e-mail sent to the 3,597 permanent staff members. The data were submitted to principal component analysis (PCA) with promax rotation, with satisfactory results (KMO = 0.86; Bartlett: 14,894.879) for a 4-factor solution. Differences between women's and men's scores were observed: the opportunities to occupy senior management positions are smaller for women than for men. There are more male managers, and staff members, especially women who do not hold management posts, perceive that there is no equality of opportunity between employees of both sexes.
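The factor extraction described above rests on principal component analysis. As a minimal illustration (not the study's 34-item survey data), the leading component of a small covariance matrix can be obtained by power iteration:

```python
# Sketch: extracting the leading principal component of a covariance
# matrix by power iteration. The 3x3 matrix below is illustrative only.
def leading_component(cov, iters=200):
    n = len(cov)
    v = [1.0] * n
    for _ in range(iters):
        # Multiply by the covariance matrix and renormalize.
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Rayleigh quotient gives the corresponding eigenvalue (variance explained).
    eigval = sum(v[i] * sum(cov[i][j] * v[j] for j in range(n)) for i in range(n))
    return eigval, v

cov = [[4.0, 2.0, 0.0],
       [2.0, 3.0, 0.0],
       [0.0, 0.0, 1.0]]
eigval, v = leading_component(cov)
print(round(eigval, 3), [round(x, 3) for x in v])
```

The first two variables load together on the leading component, while the third (uncorrelated) one does not, which is the kind of grouping the 4-factor solution captures at scale.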

Relevance:

60.00%

Publisher:

Abstract:

Data replication is a mechanism to synchronize and integrate data between distributed databases over a computer network. It is an important tool in several situations, such as creating backup systems, load balancing between nodes, distributing information across locations, and integrating heterogeneous systems. Replication also reduces network traffic, because data remains available locally, even in the event of a temporary network failure. This thesis is based on the work carried out to develop a generic application for database replication, made available as open source software. The application allows data integration between various systems, with particular focus on the integration of heterogeneous data, data fragmentation, cascading replication, data format conversion between replicas, and master/slave and multi-master synchronization.
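One replication cycle of the kind described can be sketched in a few lines. This is a minimal illustration under assumed names (an `items` table with a `last_modified` column), not the dissertation's actual application:

```python
# Sketch of one master-to-slave replication cycle: rows changed in the
# master since the last sync are upserted into the slave. Table and
# column names are illustrative assumptions.
import sqlite3

def replicate(master, slave, since):
    """Copy rows of 'items' modified after `since` from master to slave."""
    rows = master.execute(
        "SELECT id, value, last_modified FROM items WHERE last_modified > ?",
        (since,)).fetchall()
    for row in rows:
        slave.execute(
            "INSERT INTO items (id, value, last_modified) VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET value=excluded.value, "
            "last_modified=excluded.last_modified", row)
    slave.commit()
    return len(rows)

master = sqlite3.connect(":memory:")
slave = sqlite3.connect(":memory:")
for db in (master, slave):
    db.execute("CREATE TABLE items "
               "(id INTEGER PRIMARY KEY, value TEXT, last_modified INTEGER)")
master.executemany("INSERT INTO items VALUES (?, ?, ?)",
                   [(1, "a", 10), (2, "b", 20), (3, "c", 30)])
master.commit()
copied = replicate(master, slave, since=15)   # only rows 2 and 3 qualify
print(copied, slave.execute("SELECT id FROM items ORDER BY id").fetchall())
```

Because only rows changed after the last sync travel over the wire, network traffic is reduced, which is the benefit the abstract highlights.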

Relevance:

40.00%

Publisher:

Abstract:

We present an intuitive geometric approach for analysing the structure and fragility of T1-weighted structural MRI scans of human brains. Apart from computing characteristics like the surface area and volume of regions of the brain that consist of highly active voxels, we also employ Network Theory in order to test how close these regions are to breaking apart. This analysis is used in an attempt to automatically classify subjects into three categories: Alzheimer’s disease, mild cognitive impairment and healthy controls, for the CADDementia Challenge.
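One way to make the "how close to breaking apart" test concrete — an illustrative stand-in, not necessarily the paper's exact measure — is to count articulation points: nodes whose removal disconnects the voxel-adjacency graph.

```python
# Sketch: finding articulation points (cut vertices) of an adjacency
# graph with Tarjan's algorithm. In the paper's setting, nodes would be
# highly active voxels and edges spatial adjacency; this graph is a toy.
def articulation_points(adj):
    disc, low, points = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if v in disc:                      # back edge
                low[u] = min(low[u], disc[v])
            else:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    points.add(u)
        if parent is None and children > 1:    # root with >1 DFS subtree
            points.add(u)

    for node in adj:
        if node not in disc:
            dfs(node, None)
    return points

# Two triangles joined only through node 'c': removing 'c' splits the
# region in two, so this structure is fragile at exactly one point.
adj = {0: [1, 2], 1: [0, 2, "c"], 2: [0, 1, "c"],
       "c": [1, 2, 3, 4],
       3: ["c", 4, 5], 4: ["c", 3, 5], 5: [3, 4]}
print(articulation_points(adj))
```

A region with many articulation points is close to fragmenting; one with none is biconnected and robust to any single-node removal.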

Relevance:

40.00%

Publisher:

Abstract:

In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices, obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: the sequential forward selection, the sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of the above dimensionality reduction methods shows that genetic algorithms result in lower dimension feature vectors and improved classification performance.
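The core step of the feature extraction module can be sketched as follows: build a spatial gray-level co-occurrence matrix (GLCM) for one pixel offset, then derive texture features from it. The paper uses 48 such characteristics; two classic ones (contrast and energy) are shown here on a toy ROI.

```python
# Sketch: a symmetric gray-level co-occurrence matrix for one offset,
# plus two Haralick-style texture features. The 4x4 ROI is illustrative,
# not CT data.
def glcm(roi, levels, dx=1, dy=0):
    """Normalized symmetric co-occurrence matrix for pixel pairs at (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(roi), len(roi[0])
    for y in range(rows):
        for x in range(cols):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                a, b = roi[y][x], roi[ny][nx]
                m[a][b] += 1
                m[b][a] += 1          # count both directions (symmetric)
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def contrast(p):                       # local intensity variation
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(p):                         # textural uniformity
    return sum(v * v for row in p for v in row)

roi = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
p = glcm(roi, levels=4)
print(contrast(p), energy(p))
```

In the paper's pipeline such features, computed per ROI, form the input vectors that the feature selection stage (sequential or genetic) then prunes before classification.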

Relevance:

40.00%

Publisher:

Abstract:

Transportation Department, Office of University Research, Washington, D.C.

Relevance:

40.00%

Publisher:

Abstract:

Bibliography: p. 43.

Relevance:

40.00%

Publisher:

Abstract:

Bibliography: p. 25-28.

Relevance:

40.00%

Publisher:

Abstract:

Issued also as thesis (M.S.), University of Illinois.

Relevance:

40.00%

Publisher:

Abstract:

The development of new learning models has been of great importance in recent years, with a focus on advances in the area of deep learning. Deep learning rose to prominence in 2006 and has since become a major area of research across a number of disciplines. This paper examines the area of deep learning, presents its current limitations, and proposes a new idea for a fully integrated deep and dynamic probabilistic system. The new model is applicable to a large number of areas, focusing initially on medical image analysis, with the overall goal of using this approach for prediction in computer-based medical systems.

Relevance:

30.00%

Publisher:

Abstract:

The search for an Alzheimer's disease (AD) biomarker is one of the most relevant contemporary research topics due to the high prevalence and social costs of the disease. Functional connectivity (FC) of the default mode network (DMN) is a plausible candidate for such a biomarker. We evaluated 22 patients with mild AD and 26 age- and gender-matched healthy controls. All subjects underwent resting functional magnetic resonance imaging (fMRI) in a 3.0 T scanner. To identify the DMN, seed-based FC of the posterior cingulate was calculated. We also measured the sensitivity/specificity of the method, and verified a correlation with cognitive performance. We found a significant difference between patients with mild AD and controls in average z-scores: DMN, whole cortical positive (WCP) and absolute values. DMN individual values showed a sensitivity of 77.3% and specificity of 70%. DMN and WCP values were correlated to global cognition and episodic memory performance. We showed that individual measures of DMN connectivity could be considered a promising method to differentiate AD, even at an early phase, from normal aging. Further studies with larger numbers of participants, as well as validation of normal values, are needed for more definitive conclusions.
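The seed-based FC computation can be sketched in a few lines: the seed (posterior cingulate) time series is correlated with every other voxel's time series, and Pearson r is Fisher z-transformed so that maps can be averaged and compared across subjects. Region names and time series here are toy assumptions, not the study's fMRI data.

```python
# Sketch: seed-based functional connectivity with Fisher z-transform.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def fisher_z(r):
    # Variance-stabilizing transform: z-scores are comparable across subjects.
    return 0.5 * math.log((1 + r) / (1 - r))

seed = [1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0, 2.0]   # posterior cingulate (toy)
voxels = {
    "precuneus":    [1.1, 2.1, 2.9, 2.0, 1.0, 2.2, 3.1, 1.9],  # in-network
    "motor_cortex": [2.0, 1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0],  # out-of-network
}
fc_map = {name: fisher_z(pearson(seed, ts)) for name, ts in voxels.items()}
print(fc_map)
```

The in-network region tracks the seed and gets a high z-score, while the out-of-network one does not; in the study, reduced z-scores in such DMN maps are what separate mild AD patients from controls.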

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, digital computer systems and networks are essential engineering tools, used in the planning, design, operation, and control of buildings, transportation, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, reducing the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on prior knowledge of the virus code. Despite their good adaptation capability, these programs work only as vaccines against known diseases and cannot prevent new infections based on the state of the network. Here, an attempt at modeling the propagation dynamics of computer viruses relates them to other notable events occurring in the network, permitting preventive policies to be established for network management. Data from three different viruses were collected on the Internet, and two identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar.
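The autoregressive identification step can be illustrated with a minimal AR(2) least-squares fit on a synthetic series; the paper's actual data and model orders may differ.

```python
# Sketch: fitting x[t] = a1*x[t-1] + a2*x[t-2] by least squares and
# forecasting one step ahead. The "infection count" series is synthetic,
# generated from known coefficients so the fit can be checked.
def fit_ar2(x):
    # Accumulate the 2x2 normal equations and solve them by Cramer's rule.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t - 1] * x[t - 1]
        s12 += x[t - 1] * x[t - 2]
        s22 += x[t - 2] * x[t - 2]
        b1 += x[t] * x[t - 1]
        b2 += x[t] * x[t - 2]
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    return a1, a2

# Synthetic series driven by x[t] = 1.1*x[t-1] - 0.3*x[t-2]
series = [1.0, 1.2]
for _ in range(30):
    series.append(1.1 * series[-1] - 0.3 * series[-2])

a1, a2 = fit_ar2(series)
forecast = a1 * series[-1] + a2 * series[-2]
print(a1, a2, forecast)
```

Coefficients identified from one virus's history can then be used, as the paper suggests, to forecast the early dynamics of a new propagation.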

Relevance:

30.00%

Publisher:

Abstract:

In many real situations, randomness is regarded as uncertainty, or even confusion, that impedes human beings from making correct decisions. Here we study the combined role of randomness and determinism in particle dynamics for community detection in complex networks. In the proposed model, particles walk in the network and compete with each other, each trying to possess as many nodes as possible. Moreover, we introduce a rule to adjust the level of randomness of particle walking in the network, and we find that a portion of randomness can largely improve the community detection rate. Computer simulations show that the model has good community detection performance while presenting low computational complexity. (C) 2008 American Institute of Physics.
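A toy version of the particle-competition dynamics can be sketched as follows, with an `eps` parameter playing the role of the adjustable randomness level; names, parameter values, and the domination rule are illustrative simplifications of the model.

```python
# Sketch: particles walk a network, mixing random (exploratory) moves with
# deterministic moves toward nodes they already dominate. After many steps,
# each node is assigned to the particle that dominated it most.
import random

def particle_communities(adj, n_particles=2, eps=0.3, steps=4000, seed=42):
    rng = random.Random(seed)
    nodes = list(adj)
    domination = {v: [0] * n_particles for v in nodes}
    pos = rng.sample(nodes, n_particles)        # distinct start nodes
    for _ in range(steps):
        for p in range(n_particles):
            nbrs = adj[pos[p]]
            if rng.random() < eps:
                nxt = rng.choice(nbrs)          # random move (exploration)
            else:                               # deterministic move (defence)
                nxt = max(nbrs, key=lambda v: domination[v][p])
            domination[nxt][p] += 1
            pos[p] = nxt
    # Label each node by the particle that visited it most.
    return {v: max(range(n_particles), key=lambda p: domination[v][p])
            for v in nodes}

# Two 4-cliques joined by a single edge (3-4): an obvious 2-community split.
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
labels = particle_communities(adj)
print(labels)
```

Raising `eps` lets particles escape their territory and explore; the abstract's finding is that a moderate portion of such randomness improves the detection rate over a purely deterministic walk.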