908 results for average complexity
Abstract:
Starting in 2002, the Brazilian State took on the effort of standardizing emergency care through the publication of Ministerial Ordinances and policy documents. The SAMU (Mobile Emergency Care Service) was the first component of the policy to be implemented. It operates ambulances with or without a physician on board and with varied technological resources. This study aimed to analyze the potential for practicing comprehensiveness (integralidade) in the SAMU. To this end, three stages of work were carried out. The emergency policy was analyzed on the basis of the documents and Ordinances that compose it. In the fieldwork, six managers from the three levels of government were interviewed and the regulation practices of the SAMU services in the State of Rio de Janeiro were assessed. The methodology used the strategic conduct analysis framework of Giddens's (1984) Structuration Theory, relating the agents' cognitive capacities and their strategies of action to the structural dimensions. For the fieldwork, in addition to Giddens's theory, I drew on the evaluation literature for indicators (including those of the policy itself), in dialogue with the analysis of the service's situation. The Emergency Policy has as its landmarks federal funding, regionalization, professional training, the SAMU's role as an observatory of the care network, and management by emergency committees. Comprehensiveness is proposed as a value, in the recommendation to adopt the expanded concept of urgency, through regionalization and communication among services. Training was not instituted in the state, and employment ties were precarious. The National Emergency Management Committee was found to be inoperative and the State Committee absent. There is no integrated care, owing in part to the structural insufficiency of the network, represented by the absence of primary care and the precariousness of the referral hospitals. Information is neither produced nor used, and the SAMU does not fulfill its role as a health observatory. The three SAMUs have different structures. A total of 206 calls were analyzed, and their categorization highlighted: the successful SAMU, with practices of comprehensiveness in its individual component and in access to services; its role as a network observatory, which exposed the care vacuum left by the PSF (Family Health Program) and by medium-complexity services, as well as restricted hospital access; the insufficiency of resources, with inadequate use of ambulances; and unrecognized demands, in which cases whose urgency went unrecognized were refused. The prevalence of clinical emergencies stands out. Conclusion: the legitimation of medical regulation was present in the attitude of the interviewees and of some professionals in the cases of the successful SAMU. The density of the documented proposals was the facilitating aspect of the structural resource. The mobilization of authoritative and allocative resources showed weaknesses. There was no significant change in the typically exclusionary practices of the SUS, but we believe in the cumulative effect of small deviations that have ethics and solidarity as the basis for applying technical knowledge.
Abstract:
Public dental services in Brazil were limited, in practice, to basic care, so that until 2002 specialized services accounted for no more than 3.5% of all clinical procedures. This low supply reveals the difficulty of ensuring continuity of care, that is, comprehensiveness in the assistance, particularly through the referral and counter-referral system. The Brasil Sorridente program seeks to meet these needs by proposing Dental Specialty Centers (CEOs, Centros de Especialidades Odontológicas) to compose the services of average (medium) complexity. In 2005, the Ministry of Health accredited the three CEOs of Natal, located in the North II, East and West Sanitary Districts. This investigation evaluated the implementation of these CEOs, as support for the family health care teams, from the perspective of organizing the services into care networks in Natal/RN. It was an evaluation study with a qualitative approach, supplemented by some quantitative data. Dentists, users and managers were interviewed to identify and understand their perceptions, relationships and experiences in the daily routine of the services. The conceptual basis that guided the investigation was the principle of comprehensiveness, in its operational sense of the hierarchical organization of health care levels. Data were collected through documentary research, direct observation and semi-structured interviews. The analysis was carried out by triangulating the content extracted from the techniques used and from the statements of the groups interviewed, seeking theoretical-conceptual support in the specific literature. The results pointed to aspects that move away from comprehensiveness, such as: low problem-solving capacity in the basic care network; little valuing of the space of the health units; traditional models of access to health services; insufficient supply of some specialties, compromising the referral and counter-referral system; procedure-centered practices in the CEO; bureaucratic referrals from basic care to the specialized service; a disintegrated and disjointed system across levels of care; and disregard of the municipal protocol. On the other hand, there is an approximation to comprehensiveness in situations such as: increased access and coverage in the Family Health Strategy (ESF, Estratégia Saúde da Família); a closer relationship between professional and user; a tendency toward quantitative and qualitative growth of specialized actions; isolated initiatives of articulation among levels; and the existence of a protocol to guide professionals.
Abstract:
Despite increasing interest in pathological and non-pathological dissociation, few researchers have focused on the spiritual experiences involving dissociative states such as mediumship, in which an individual (the medium) claims to be in communication with, or under the control of, the mind of a deceased person. Our preliminary study investigated psychography - in which allegedly "the spirit writes through the medium's hand" - for potential associations with specific alterations in cerebral activity. We examined ten healthy psychographers - five less expert mediums and five with substantial experience, ranging from 15 to 47 years of automatic writing and 2 to 18 psychographies per month - using single photon emission computed tomography to scan activity as subjects were writing, in both dissociative trance and non-trance states. The complexity of the original written content they produced was analyzed for each individual and for the sample as a whole. The experienced psychographers showed lower levels of activity in the left culmen, left hippocampus, left inferior occipital gyrus, left anterior cingulate, right superior temporal gyrus and right precentral gyrus during psychography compared to their normal (non-trance) writing. The average complexity scores for psychographed content were higher than those for control writing, for both the whole sample and for experienced mediums. The fact that subjects produced complex content in a trance dissociative state suggests they were not merely relaxed, and relaxation seems an unlikely explanation for the underactivation of brain areas specifically related to the cognitive processing being carried out. This finding deserves further investigation both in terms of replication and explanatory hypotheses.
Abstract:
Background: The increasing number of children with congenital heart disease surviving into adulthood demands greater preparation from the professionals and institutions that care for them. Objective: To describe the profile of patients aged over 16 years with congenital heart disease who have undergone surgery, and to analyze the risk factors that predict hospital mortality. Methods: One thousand five hundred and twenty patients (mean age 27 +/- 13 years) were operated on between January 1986 and December 2010. We performed a descriptive analysis of the epidemiological profile of the study population and analyzed risk factors for hospital mortality, considering the complexity score, the year in which surgery was performed, whether or not the procedure was performed by a pediatric surgeon, and reoperation. Results: There was a significant increase in the number of cases from the year 2000 onward. The average complexity score was 5.4, and septal defects represented 45% of cases. Overall mortality was 7.7%, and most of the more complex procedures (973, or 61.9%) were performed by pediatric surgeons. Complexity (OR 1.5), reoperation (OR 2.17) and pediatric surgeon (OR 0.28) were independent risk factors influencing mortality. Multivariate analysis showed that the year in which the surgery was performed (OR 1.03), the complexity (OR 1.44) and the pediatric surgeon (OR 0.28) influenced the result. Conclusion: There is an increasing number of patients aged over 16 years; despite the large number of simple cases, the most complex ones were referred to pediatric surgeons, who achieved lower mortality, especially in recent years. (Arq Bras Cardiol 2012;98(5):390-397)
Abstract:
The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics. Each of the three metrics provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
Abstract:
The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining quickly the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
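As a concrete illustration of the methodology described in the two abstracts above, the three Halstead metrics they name (program length, difficulty, effort) are computed from operator and operand counts. The sketch below is not the authors' instrument; it is a minimal illustration that assumes a query has already been tokenized into operator and operand lists, with an arbitrary tokenization scheme.

```python
from math import log2

def halstead(operators, operands):
    """Compute the three Halstead metrics used in the studies above:
    program length, difficulty, and effort.
    operators, operands: token lists from a parsed query (the
    tokenization scheme is an assumption of this sketch)."""
    n1, n2 = len(set(operators)), len(set(operands))  # distinct counts
    N1, N2 = len(operators), len(operands)            # total counts
    length = N1 + N2                                  # program length N
    vocabulary = n1 + n2                              # vocabulary n
    volume = length * log2(vocabulary)                # V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)                 # D = (n1/2)*(N2/n2)
    effort = difficulty * volume                      # E = D * V
    return length, difficulty, effort

# Example: tokens of  SELECT name FROM emp WHERE dept = 'sales'
ops = ["SELECT", "FROM", "WHERE", "="]
opnds = ["name", "emp", "dept", "'sales'"]
print(halstead(ops, opnds))  # lower values suggest an easier query
```

Averaging these scores over a representative sample of queries against each candidate schema gives the weighted average complexity that the papers use to rank the alternatives.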
Abstract:
In this paper we present a data structure which improves the average complexity of the operations of updating and of a certain type of information retrieval on an array. The data structure is derived from a particular family of digraphs satisfying conditions that make them represent solutions to this problem.
Abstract:
In this paper a constructive method for data structures solving an array maintenance problem is offered. These data structures are defined in terms of a previously defined family of digraphs representing solutions to this problem. We also present a prototype of the method in Haskell.
Abstract:
Let $V$ be an array. The range query problem concerns the design of data structures for implementing the following operations. The operation update(j, x) has the effect $v_j \leftarrow v_j + x$, and the query operation retrieve(i, j) returns the partial sum $v_i + \cdots + v_j$. These tasks are to be performed on-line. We define an algebraic model, based on the use of matrices, for the study of the problem. In this paper we also establish a lower bound for the sum of the average complexity of both kinds of operations, and demonstrate that this lower bound is near optimal in terms of asymptotic complexity.
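The range query problem stated above is a classic one. A standard structure achieving O(log n) time for both operations, and hence low average complexity, is the Fenwick (binary indexed) tree sketched below. It is offered only as a familiar baseline for the problem statement; it is not the digraph-based construction proposed in these three papers.

```python
class FenwickTree:
    """Standard binary indexed tree: point update and range sum.
    A familiar baseline for the range query problem; not the
    digraph-based structure of the papers above."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-indexed internal array

    def update(self, j, x):
        """v_j <- v_j + x, in O(log n)."""
        while j <= self.n:
            self.tree[j] += x
            j += j & (-j)          # climb to the next responsible node

    def _prefix(self, j):
        """Return v_1 + ... + v_j."""
        s = 0
        while j > 0:
            s += self.tree[j]
            j -= j & (-j)          # drop the lowest set bit
        return s

    def retrieve(self, i, j):
        """Return the partial sum v_i + ... + v_j."""
        return self._prefix(j) - self._prefix(i - 1)

ft = FenwickTree(8)
ft.update(3, 5)
ft.update(7, 2)
print(ft.retrieve(1, 8))  # 7
```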
Abstract:
Objective: this study focused on evaluating the effectiveness of a pharmacist's intervention in reducing the degree of medication regimen complexity in a nursing home. Methods: this was a randomized controlled study. Data were collected at the nursing home of the Santa Casa da Misericórdia das Alcáçovas, in the municipality of Viana do Alentejo, District of Évora. The sample comprised the institutionalized residents (n=86), who were randomly assigned to intervention and control groups. In March 2007, the Medication Regimen Complexity Index (MRCI) was used to establish the baseline. An informative session was held with the physician about the importance and effects of the MRCI obtained. The intervention phase began in May 2007 and consisted of reporting to the physician the MRCI of each resident's medication regimen, the mean MRCI for the nursing home, and some recommendations for reducing it. Ninety days after the intervention, the MRCI was reassessed for all residents. Results: the mean age of the 86 residents was 83.9 years, and 66.3% were women. At baseline, residents were using 7.8 medicines and presented an MRCI of 22.9 (95% CI 20.1-25.7). During the intervention phase, 2 residents in the intervention group and 5 in the control group died. After the intervention, the number of medicines was reduced in the intervention group (p = 0.035), but not in the control group (p = 0.079). The MRCI of the intervention group fell from 22.2 to 16.8 (p = 0.015), whereas the MRCI of the control group fell only from 23.6 to 20.0 (p = 0.091). All three sections of the MRCI were significantly reduced in the intervention group, but none of them in the control group.
Conclusion: a clinical pharmacist's intervention can contribute to reducing medication regimen complexity in the elderly, with a slight reduction in the number of medicines taken by the residents and without focusing the intervention on one specific aspect of the therapeutic regimen.
Abstract:
The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S_I}$ and $\mathcal{S_C}$, and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S_I}$ and $\mathcal{S_C}$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication complexity (CC) theory. A lower bound derived for the two-party average case CC of general functions is compared against the performance of a greedy algorithm. The average case CC of the relevant greater-than (GT) function is characterized to within two bits. In the second approach, each sensor node broadcasts a single bit arising from appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented that include intruder tracking using a naive polynomial-regression algorithm.
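To make the greater-than (GT) function concrete: two parties holding n-bit values x and y can decide whether x > y by comparing bits from the most significant position downward and stopping at the first disagreement. The sketch below counts the bits exchanged by this naive deterministic protocol; it is for intuition only and is not the paper's bound-achieving scheme. On uniformly random inputs the first disagreement tends to occur after a constant expected number of rounds, which is why the average case CC of GT can be so small.

```python
def gt_protocol(x, y, n):
    """Naive bit-by-bit protocol for the greater-than function.
    Alice holds x, Bob holds y (both n-bit). Each round, Alice sends
    her current bit and Bob replies with one bit (continue/halt).
    Returns (x > y, total bits exchanged). Illustration only."""
    bits = 0
    for k in range(n - 1, -1, -1):     # most significant bit first
        a = (x >> k) & 1               # Alice transmits her bit
        b = (y >> k) & 1
        bits += 2                      # Alice's bit + Bob's reply
        if a != b:
            return a > b, bits         # first disagreement decides
    return False, bits                 # x == y, so x > y is False

print(gt_protocol(0b1011, 0b1001, 4))  # (True, 6): bits differ at position 1
```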
Abstract:
This work introduces a complexity measure which addresses some conflicting issues among existing ones by using a new principle: measuring the average amount of symmetry broken by an object. It attributes low (although different) complexity to both deterministic and random homogeneous densities, and higher complexity to the intermediate cases. This new measure is easily computable, breaks the coarse-graining paradigm and can be straightforwardly generalized, including to continuous cases and general networks. By applying this measure to a series of objects, it is shown that it can be consistently used for both small-scale structures with exact symmetry breaking and large-scale patterns, for which, unlike similar measures, it consistently discriminates between repetitive patterns, random configurations and self-similar structures.
Abstract:
In the multi-view approach to semisupervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm, in which the views are reproducing kernel Hilbert spaces (RKHSs), and the disagreement penalty is the average squared difference in predictions. The final predictor is the pointwise average of the predictors from each view. We call the set of predictors that can result from this procedure the co-regularized hypothesis class. Our main result is a tight bound on the Rademacher complexity of the co-regularized hypothesis class in terms of the kernel matrices of each RKHS. We find that the co-regularization reduces the Rademacher complexity by an amount that depends on the distance between the two views, as measured by a data-dependent metric. We then use standard techniques to bound the gap between training error and test error for the CoRLS algorithm. Experimentally, we find that the amount of reduction in complexity introduced by co-regularization correlates with the amount of improvement that co-regularization gives in the CoRLS algorithm.
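For reference, the quantity being bounded is the empirical Rademacher complexity; the display below gives its standard definition together with a schematic form of the CoRLS objective as described in the abstract. The notation (sample size n, labeled points x_i, unlabeled points x_u, weights lambda_v and mu) is ours, not necessarily the paper's.

```latex
% Standard empirical Rademacher complexity of a class F on a sample S
\hat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\Big[\sup_{f \in \mathcal{F}}
      \frac{1}{n}\sum_{i=1}^{n} \sigma_i f(x_i)\Big],
  \qquad \sigma_i \in \{-1,+1\} \text{ uniform i.i.d.}

% Schematic CoRLS objective over two RKHS views f_1, f_2:
% squared loss on labeled data, RKHS norms, and a coupling term
% penalizing disagreement on the unlabeled points (weights are ours).
\min_{f_1, f_2}\
  \sum_{v=1}^{2}\Big[\sum_{i}\big(y_i - f_v(x_i)\big)^2
    + \lambda_v \lVert f_v \rVert_{\mathcal{H}_v}^2\Big]
  + \mu \sum_{u}\big(f_1(x_u) - f_2(x_u)\big)^2,
  \qquad \text{final predictor } \tfrac{1}{2}(f_1 + f_2).
```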
Abstract:
"Extended Clifford algebras" are introduced as a means to obtain low ML decoding complexity space-time block codes. Using left regular matrix representations of two specific classes of extended Clifford algebras, two systematic algebraic constructions of full diversity Distributed Space-Time Codes (DSTCs) are provided for any power of two number of relays. The left regular matrix representation has been shown to naturally result in space-time codes meeting the additional constraints required for DSTCs. The DSTCs so constructed have the salient feature of reduced Maximum Likelihood (ML) decoding complexity. In particular, the ML decoding of these codes can be performed by applying the lattice decoder algorithm on a lattice of four times lesser dimension than what is required in general. Moreover these codes have a uniform distribution of power among the relays and in time, thus leading to a low Peak to Average Power Ratio at the relays.
Abstract:
It is known that, in an OFDM system, using a Hadamard transform or phase alteration before the IDFT operation can reduce the Peak-to-Average Power Ratio (PAPR). Both of these techniques can be viewed as constellation precoding for PAPR reduction. In general, using non-diagonal transforms, like the Hadamard transform, increases the ML decoding complexity. In this paper we propose the use of block-IDFT matrices and show that appropriate block-IDFT matrices give lower PAPR as well as lower decoding complexity compared to using the Hadamard transform. Moreover, we present a detailed study of the tradeoff between PAPR reduction and ML decoding complexity when using block-IDFT matrices with various block sizes.
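The PAPR computation itself is simple; the sketch below measures the PAPR of a random QPSK OFDM symbol with and without Hadamard precoding before the IDFT. It illustrates the quantity being traded off in this paper, not the proposed block-IDFT construction; the subcarrier count and constellation are arbitrary choices, and a single random realization may or may not show a reduction.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def papr_db(x):
    """Peak-to-Average Power Ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
N = 64                                        # subcarriers (arbitrary)
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

plain = np.fft.ifft(qpsk)                     # conventional OFDM symbol
H = hadamard(N) / np.sqrt(N)                  # unitary Hadamard precoder
precoded = np.fft.ifft(H @ qpsk)              # Hadamard-precoded OFDM symbol

print(f"PAPR plain:    {papr_db(plain):.2f} dB")
print(f"PAPR precoded: {papr_db(precoded):.2f} dB")
```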