969 results for Data compression (Telecommunications)
Abstract:
INTRODUCTION: Effective strategies for venous thromboembolism (VTE) prophylaxis are widely available, but they are still underused, particularly in our setting. OBJECTIVE: To evaluate the effect of implementing a guideline for VTE prophylaxis in surgical patients on the healthcare team's prescribing of this prophylaxis. METHOD: Retrospective pre-intervention/post-intervention study. The records of 150 patients before and 150 after the implementation of a prophylaxis guideline (groups AID and DID, respectively) were randomly selected from patients over 40 years of age admitted for major abdominal or orthopedic surgery. Demographic data, mention of VTE risk in the medical record, prescription of VTE prophylaxis, and diagnosis of VTE during hospitalization were recorded. RESULTS: The AID and DID groups did not differ in demographic data or in duration of prophylaxis (5.6 vs 6.6 days). The frequency of prophylaxis, AID vs DID, before surgery was: pharmacological prophylaxis (PP), 6% vs 9%; graduated compression stockings (GCS), 4% vs 3%; intermittent pneumatic compression (IPC), 2% vs 3%. After surgery: PP, 53% vs 53%; GCS, 23% vs 40% (P<0.05); IPC, 26% vs 32%. In total (AP), prophylaxis was prescribed for 60.5% of AID patients and 66.5% of DID patients, but the prophylaxis was considered adequate in only 34% of AID patients and 32% of DID patients. CONCLUSION: Adoption of the protocol, although it reflected greater concern with prophylaxis, as shown by the increase in GCS prescription, improved its quality only minimally, indicating the need for other active and continuous interventions to increase adherence to the guideline.
Abstract:
In the present study, a logistic regression model was fitted to predict the probability of death of dogs affected by hemorrhagic gastroenteritis. The logistic model is recommended for dichotomous response variables in cohort studies. A census of 176 animals treated for hemorrhagic gastroenteritis was recorded at four veterinary clinics in the city of Lavras, southern Minas Gerais, between 1992 and 1999. After variable selection using Pearson's test or Fisher's exact test, the model was fitted considering the variables sex, age, days of hospitalization, and number of visits. The parameters were estimated by the maximum likelihood method. It was concluded that, when dogs affected by hemorrhagic gastroenteritis are seen only once, those older than 6 months are 15.45 times more likely to die (P<0.05) than those under 6 months of age. When the affected animals are older than 6 months, the odds of dying if they are seen only once are 20.251 times higher (P<0.05) than if they receive 2 to 7 visits.
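As a rough illustration of the kind of analysis described above, the sketch below fits a logistic model for a dichotomous outcome (death) by maximum likelihood and reads off odds ratios by exponentiating the coefficients. The data, predictors, and coefficient values are invented stand-ins for the study's records, and statsmodels is assumed to be available.

```python
# A minimal sketch: logistic regression for a binary outcome, fitted by
# maximum likelihood, with odds ratios from exponentiated coefficients.
# All data below are simulated, not the paper's clinical records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 176
age_over_6m = rng.integers(0, 2, n)          # 1 = older than 6 months
single_visit = rng.integers(0, 2, n)         # 1 = seen only once
logit = -2.0 + 1.2 * age_over_6m + 1.5 * single_visit   # invented effects
death = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([age_over_6m, single_visit]))
result = sm.Logit(death, X).fit(disp=False)  # maximum likelihood estimation
print(np.exp(result.params))                 # odds ratios per predictor
```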
Abstract:
The objective of the present article is to identify and discuss the possibilities of using qualitative data analysis software within the framework of procedures proposed by SDI (socio-discursive interactionism), emphasizing freely distributed software or free versions of commercial software. A literature review of software for qualitative data analysis in the social sciences and humanities, focusing on language studies, is presented. Some tools, such as WeftQDA, MLCT, Yoshikoder and Tropes, are examined with their respective features and functions. The software Tropes is examined in more detail because of its particular relation to language and semantic analysis, as well as its embedded classification of linguistic elements such as types of verbs, adjectives, modalizations, etc. Although completely automating an SDI-based analysis is not feasible, the programs appear to be powerful helpers in analyzing specific questions. Still, it seems important to be familiar with the software options and to use different applications in order to obtain a more diversified view of the data. It is up to the researcher to be critical of the analysis provided by the machine.
Abstract:
Deals with some common problems in structural analysis when calculating the experimental semi-variogram and fitting a semi-variogram model. Geochemical data were used, and the following cases were studied: regular versus irregular sampling grids, the presence of outlier values, skewed distributions due to high variability in the data, and estimation using a kriging procedure. -from English summary
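A minimal sketch of one of the cases discussed, computing an experimental semi-variogram for irregularly spaced samples: point pairs are grouped into lag-distance bins h, and gamma(h) is half the mean squared difference of the paired values in each bin. The coordinates, values, and bin edges below are invented for illustration.

```python
# Experimental semi-variogram for irregular sampling (synthetic data):
#   gamma(h) = (1 / 2N(h)) * sum over pairs at lag h of (z_i - z_j)^2
import numpy as np

rng = np.random.default_rng(1)
coords = rng.random((200, 2)) * 100.0        # irregular sampling locations
values = np.sin(coords[:, 0] / 15.0) + 0.1 * rng.standard_normal(200)

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
sq = (values[:, None] - values[None, :]) ** 2
iu = np.triu_indices(len(values), k=1)       # count each pair once

bins = np.linspace(0.0, 50.0, 11)            # lag-distance bins
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d[iu] >= lo) & (d[iu] < hi)
    if mask.any():
        gamma = 0.5 * sq[iu][mask].mean()    # semi-variance at this lag
        print(f"lag {lo:4.1f}-{hi:4.1f}: gamma = {gamma:.3f} ({mask.sum()} pairs)")
```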
Abstract:
Presents a study of the spectral response of a specific vegetative cover under the same sun elevation angle but in different slope classes, through Landsat transparencies. The site, located in the region of Presidente Prudente, was studied through topographic sheets to define the slope classes. Densitometric readings were obtained for selected areas, representing the terrain reflectance under different relief conditions. Cluster analysis was used to classify the densitometric data according to the slope classes. The map of slope classes versus terrain surface reflectance showed a high correlation, mainly for classes A (0-10%) and B (10-20%). -from English summary
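A hedged sketch of the classification step described above. The summary does not name the clustering algorithm, so k-means is used here as one plausible choice; the densitometric readings and group means are invented.

```python
# Grouping one-dimensional densitometric readings by cluster analysis
# (k-means as a stand-in algorithm; all readings are simulated).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
readings = np.concatenate([
    rng.normal(0.30, 0.02, 40),   # e.g. areas in slope class A (0-10%)
    rng.normal(0.45, 0.02, 40),   # e.g. areas in slope class B (10-20%)
])
labels = KMeans(n_clusters=2, n_init=10, random_state=0) \
    .fit_predict(readings.reshape(-1, 1))
for k in (0, 1):
    print(f"cluster {k}: mean reflectance = {readings[labels == k].mean():.3f}")
```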
Abstract:
Successful application of the shallow seismic reflection method is related directly to the ability of the ground to transmit high-frequency seismic energy from a seismic source. The dominant frequencies of reflection data are in the range of 50 to 100 Hz and depend on the surface materials and the water table. A spread of geophones sends the seismic signal to be recorded on a 12-, 24- or 48-channel portable seismograph, using a single high-frequency geophone per channel. Although seismographs with display and processing units are available, it is also possible to transfer the digitized data to a personal computer for processing and interpretation using specific programs. Some results are presented from two recent field studies carried out to characterize the underground structure in the tidal flats area of Baie St. Paul (Quebec, Canada) and in glacial terrains in the Waterloo (Ontario, Canada) region. -from English summary
Abstract:
Software in BASIC (GWBASIC, version 2.0) and TURBO PASCAL (version 3.0) is presented for PC-type microcomputers for computing Andrews' graphical method for multivariate data. When both programs were applied to skull measurement data from the Irati Formation mesosaurids, the TURBO PASCAL version performed better in speed and graphical quality. -after English summary
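Neither program's source is reproduced here, but as a sketch of the underlying method: Andrews' curves map each multivariate observation x = (x1, x2, ..., xp) to the function f_x(t) = x1/sqrt(2) + x2*sin(t) + x3*cos(t) + x4*sin(2t) + ... over -pi <= t <= pi, so that similar observations trace similar curves. The measurement values below are invented, not the Irati Formation data.

```python
# Andrews' graphical method for multivariate data: each row becomes a
# Fourier-style curve over t in [-pi, pi]. Data are invented placeholders.
import numpy as np

def andrews_curve(x, t):
    terms = [x[0] / np.sqrt(2.0) * np.ones_like(t)]
    for i, xi in enumerate(x[1:], start=1):
        k = (i + 1) // 2                     # harmonic index 1,1,2,2,3,3,...
        trig = np.sin if i % 2 else np.cos   # odd terms sin, even terms cos
        terms.append(xi * trig(k * t))
    return sum(terms)

t = np.linspace(-np.pi, np.pi, 200)
skulls = np.array([[2.1, 0.8, 1.3, 0.5],     # invented skull measurements
                   [2.0, 0.9, 1.2, 0.6]])
curves = [andrews_curve(row, t) for row in skulls]
print(curves[0][:3])                         # curve values to plot against t
```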
Abstract:
Crustal discontinuities may be seen as A-type collision sutures with triple-junction arrays. Shear belts developed at the block borders due to oblique plate convergence. A consistent litho-structural zoning may be observed along the border zones of the blocks: the known high-grade terrains are exposed along the upper block border and pass into distal granite-greenstone terrains; in the lower block, granite-greenstone terrains form the older basement, and supracrustals occur as a metavolcano-sedimentary belt near or adjacent to the suture. This regional litho-structural framework may be related to diachronous collisions of sialic masses which led to their amalgamation into an extensive continental mass. -from English summary
Abstract:
The merit of the Karhunen-Loève transform is well known. Since its basis is the eigenvector set of the covariance matrix, a statistical, rather than functional, representation of the variance in pattern ensembles is generated. By using the Karhunen-Loève transform coefficients as a natural feature representation of a character image, the eigenvector set can be regarded as a feature extractor for a classifier.
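A minimal sketch of the Karhunen-Loève transform used as a feature extractor in this way: the eigenvectors of the sample covariance matrix form the basis, and each image's projection coefficients onto the leading eigenvectors serve as its features. The images below are random stand-ins for character images, and the dimensions are arbitrary choices.

```python
# Karhunen-Loeve (KL) transform as a feature extractor: eigenvectors of
# the ensemble covariance form the basis; projections are the features.
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((500, 64))                 # 500 "character images", 8x8 flattened
Xc = X - X.mean(axis=0)                   # center the pattern ensemble
cov = np.cov(Xc, rowvar=False)            # 64x64 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: covariance is symmetric
order = np.argsort(eigvals)[::-1]         # sort by decreasing variance
basis = eigvecs[:, order[:16]]            # keep the 16 leading eigenvectors
features = Xc @ basis                     # KL coefficients fed to a classifier
print(features.shape)                     # (500, 16)
```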
Abstract:
The nylon bag in situ degradation technique was employed to compare CNCPS data with the ruminal degradability of the dry matter and crude protein of corn silage, soybean meal and sorghum grain, using four rumen-fistulated Nellore steers averaging 36 months of age and 520 kg of live weight. A randomized complete block experimental design was used, in which the animals constituted the blocks. Two concentrate levels, 18 and 39%, were used in the diets. The forage used in the diets was corn silage, and the concentrate ingredients were soybean meal, cottonseed meal, corn grain and sorghum grain. There was a reduction in the potentially degradable fraction (B) of the dry matter (DM) of the corn silage and sorghum grain with an increase in the concentrate level of the diet; however, the degradation rate (c) of the silage was similar for the two diets, and that of the sorghum grain showed an increase of 28.4%. The B fraction of the DM of the soybean meal was not affected by the diet, but its rate (c) was reduced by 18.1%. The same effect was observed for the rate (c) of the crude protein (CP) of the soybean meal, with a reduction of 38.1%. The effective degradability values of the two fractions were not affected by the diet when lag time was not considered. When lag time was considered, the degradability values of the studied feeds were higher for both fractions.
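For context, the exponential model commonly used with the nylon bag technique (Ørskov and McDonald) expresses degradation as p(t) = a + B(1 - e^(-ct)) and effective degradability as ED = a + Bc/(c + k) for a rumen outflow rate k, or ED = a + Bc·e^(-kL)/(c + k) when a lag time L is included; in practice, including the lag also changes the fitted a, B and c, which is how lag-corrected degradabilities can come out higher. The sketch below uses invented parameter values, not the paper's estimates.

```python
# Effective degradability under the exponential degradation model
# (a sketch; the values of a, B, c, k and L are invented, not fitted).
import math

a, B, c = 0.25, 0.55, 0.05   # soluble fraction, degradable fraction B, rate c (/h)
k = 0.04                     # assumed rumen outflow rate (/h)
L = 2.0                      # assumed lag time (h)

ed_no_lag = a + B * c / (c + k)                    # ED = a + Bc/(c + k)
ed_lag = a + B * c * math.exp(-k * L) / (c + k)    # same, discounted for lag L
print(f"ED without lag: {ed_no_lag:.3f}, with lag: {ed_lag:.3f}")
```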
Abstract:
The purpose of this article is to review the main codes used to detect and correct errors in data communication, specifically in computer networks. Hamming codes and the Cyclic Redundancy Check (CRC) are the focus of this article, together with the hardware implementation of CRC. Each code is reviewed in detail in order to fill gaps in the literature and to make the subject accessible to computer science and engineering students, as well as to anyone interested in learning techniques for handling errors in data communication.
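As a concrete example of the error-detection side the article reviews, the sketch below computes a bitwise CRC in software. The CRC-8 generator polynomial 0x07 (x^8 + x^2 + x + 1) is just an illustrative choice; the article's specific polynomials and its shift-register hardware implementation are not reproduced here.

```python
# Bitwise CRC-8 over a byte string: shift the register left one bit at a
# time, XORing in the generator polynomial whenever the top bit falls out.
def crc8(data: bytes, poly: int = 0x07, init: int = 0x00) -> int:
    crc = init
    for byte in data:
        crc ^= byte                          # fold the next message byte in
        for _ in range(8):
            if crc & 0x80:                   # top bit set: shift, then divide
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

# The sender appends crc8(frame) to the frame; the receiver recomputes the
# checksum over the received bytes and compares to detect transmission errors.
print(hex(crc8(b"example frame")))
```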
Abstract:
The aim of this work was to evaluate the influence of compression force and humidity on the dissolution profile of a tablet formulation. As hydrochlorothiazide presents real bioavailability problems, it was incorporated as the standard drug in a tablet formulation in order to study the mechanical resistance, disintegration time and dissolution profile as a function of humidity and compression force. The disintegration time was not affected by the compression force, but it was influenced by humidity. The dissolution profile was altered both by the compression force and by the humidity. Both factors can alter the bioavailability of drugs dispensed in tablet form.
Abstract:
This research studies the influence of the pozzolanic activity of limestone and basalt on the compressive strength behavior of high-performance self-compacting concrete (HPSCC). Limestone filler and basalt filler were selected as additives because they are industrial residues, thereby contributing to sustainable development. The paste of this type of concrete consists of cement, silica fume, limestone or basalt filler, water and a superplasticizer admixture. In this research, the water/cement ratio was fixed at 0.40 kg/kg and the silica fume/cement ratio at 0.10 kg/kg, while the filler/cement and superplasticizer/cement ratios were determined through Marsh cone and mini-slump tests. The granular skeleton was obtained from the combination of quartz sand and crushed basalt that presented the lowest voids index. The results show that the HPSCC with limestone filler has greater compressive strength than the HPSCC with basalt filler at ages of 7, 28 and 63 days. This is explained by the fact that the limestone filler presents a greater pozzolanic activity index than the basalt filler. In addition, the water/fines ratio for the HPSCC with limestone filler is 0.27 l/kg, whereas for the HPSCC with basalt filler it is 0.29 l/kg.
Abstract:
This paper presents a region-based methodology for segmenting Digital Elevation Models obtained from laser scanning data. The methodology is based on two sequential techniques: a recursive splitting technique using the quadtree structure, followed by a region-merging technique using the Markov Random Field model. The recursive splitting technique starts by splitting the Digital Elevation Model into homogeneous regions. However, due to slight height differences in the Digital Elevation Model, region fragmentation can be relatively high. In order to minimize the fragmentation, a region-merging technique based on the Markov Random Field model is applied to the previously segmented data. The resulting regions are first structured using the so-called Region Adjacency Graph. Each node of the Region Adjacency Graph represents a region of the segmented Digital Elevation Model, and two nodes are connected if the corresponding regions share a common boundary. Next, the random variable related to each node is assumed to follow the Markov Random Field model. This hypothesis allows the derivation of the posterior probability distribution, whose solution is obtained by Maximum a Posteriori estimation. Regions presenting a high probability of similarity are merged. Experiments carried out with laser scanning data showed that the methodology separates the objects in the Digital Elevation Model with a low degree of fragmentation.
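A hedged sketch of the first stage described above: a recursive quadtree split of an elevation grid into homogeneous blocks, using the height range within a block as an assumed homogeneity test. The threshold, grid, and stopping rule are invented, and the Markov Random Field merging stage is not shown.

```python
# Recursive quadtree splitting of a DEM: a block becomes a leaf when its
# height range is within tol (assumed criterion), else it is quartered.
import numpy as np

def quadtree_split(dem, r0, c0, h, w, tol, out):
    block = dem[r0:r0 + h, c0:c0 + w]
    if h <= 1 or w <= 1 or block.max() - block.min() <= tol:
        out.append((r0, c0, h, w))           # homogeneous leaf region
        return
    h2, w2 = h // 2, w // 2                  # quarter the block
    for dr, dc, hh, ww in [(0, 0, h2, w2), (0, w2, h2, w - w2),
                           (h2, 0, h - h2, w2), (h2, w2, h - h2, w - w2)]:
        quadtree_split(dem, r0 + dr, c0 + dc, hh, ww, tol, out)

rng = np.random.default_rng(4)
dem = np.cumsum(rng.standard_normal((64, 64)), axis=0)  # synthetic heights
regions = []
quadtree_split(dem, 0, 0, 64, 64, 2.0, regions)
print(len(regions), "leaf regions before MRF-based merging")
```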