893 results for Data treatment
Abstract:
Purpose: Traditional patient-specific IMRT QA measurements are labor intensive and consume machine time. Calculation-based IMRT QA methods typically are not comprehensive. We have developed a comprehensive calculation-based IMRT QA method to detect uncertainties introduced by the initial dose calculation, the data transfer through the Record-and-Verify (R&V) system, and various aspects of the physical delivery. Methods: We recomputed the treatment plans in the patient geometry for 48 cases using data from the R&V system and from the delivery unit to calculate the "as-transferred" and "as-delivered" doses, respectively. These data were sent to the original TPS to verify transfer and delivery, or to a second TPS to verify the original calculation. For each dataset we examined the dose computed from the R&V record (RV) and from the delivery records (Tx), and the dose computed with a second verification TPS (vTPS). Each verification dose was compared to the clinical dose distribution using 3D gamma analysis and by comparison of mean dose and ROI-specific dose levels to target volumes. Plans were also compared to IMRT QA absolute and relative dose measurements. Results: The average 3D gamma passing percentages using 3%-3mm, 2%-2mm, and 1%-1mm criteria for the RV plan were 100.0 (σ=0.0), 100.0 (σ=0.0), and 100.0 (σ=0.1); for the Tx plan they were 100.0 (σ=0.0), 100.0 (σ=0.0), and 99.0 (σ=1.4); and for the vTPS plan they were 99.3 (σ=0.6), 97.2 (σ=1.5), and 79.0 (σ=8.6). When comparing target volume doses in the RV, Tx, and vTPS plans to the clinical plans, the average ratios of ROI mean doses were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.990 (σ=0.009), and of ROI-specific dose levels were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.980 (σ=0.043), respectively. Comparing the clinical, RV, Tx, and vTPS calculated doses to the IMRT QA measurements for all 48 patients, the average ratios for absolute doses were 0.999 (σ=0.013), 0.998 (σ=0.013), 0.999 (σ=0.015), and 0.990 (σ=0.012), respectively, and the average 2D gamma (5%-3mm) passing percentages for relative doses for 9 patients were 99.36 (σ=0.68), 99.50 (σ=0.49), 99.13 (σ=0.84), and 98.76 (σ=1.66), respectively. Conclusions: Together with mechanical and dosimetric QA, our calculation-based IMRT QA method promises to minimize the need for patient-specific QA measurements by identifying outliers in need of further review.
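To make the comparison metric concrete, below is a minimal, illustrative sketch of a global-normalization gamma passing-rate calculation of the kind used to compare a verification dose grid against the clinical dose grid. It is not the authors' implementation: the brute-force search window, the low-dose cutoff, and the voxel spacing are assumptions made purely for illustration.

```python
# Minimal global-gamma passing-rate sketch (assumptions: global normalization,
# 10% low-dose cutoff, brute-force neighborhood search limited to the distance
# criterion). Not the authors' implementation.
import numpy as np

def gamma_passing_rate(ref, eva, spacing_mm, dose_crit_pct=3.0, dist_crit_mm=3.0,
                       low_dose_cutoff_pct=10.0):
    """Percent of reference voxels with gamma <= 1; ref/eva share one grid."""
    ref = np.asarray(ref, dtype=float)
    eva = np.asarray(eva, dtype=float)
    d_max = ref.max()
    dose_tol = dose_crit_pct / 100.0 * d_max        # absolute dose tolerance
    cutoff = low_dose_cutoff_pct / 100.0 * d_max    # skip very low doses

    # Candidate offsets within the distance-to-agreement criterion.
    half = [int(np.ceil(dist_crit_mm / s)) for s in spacing_mm]
    offsets = np.array(np.meshgrid(*[np.arange(-h, h + 1) for h in half],
                                   indexing="ij")).reshape(len(half), -1).T
    dist2 = np.array([sum((o * s) ** 2 for o, s in zip(off, spacing_mm))
                      for off in offsets])
    keep = dist2 <= dist_crit_mm ** 2
    offsets, dist2 = offsets[keep], dist2[keep]

    passed = total = 0
    for idx in np.ndindex(ref.shape):
        if ref[idx] < cutoff:
            continue
        total += 1
        best = np.inf
        for off, d2 in zip(offsets, dist2):
            j = tuple(i + o for i, o in zip(idx, off))
            if any(k < 0 or k >= n for k, n in zip(j, ref.shape)):
                continue
            dd2 = (eva[j] - ref[idx]) ** 2 / dose_tol ** 2   # dose-difference term
            best = min(best, dd2 + d2 / dist_crit_mm ** 2)   # gamma squared
        if best <= 1.0:
            passed += 1
    return 100.0 * passed / max(total, 1)
```

A 3%-3mm criterion, for example, would be `gamma_passing_rate(ref, eva, (2.0, 2.0, 2.0), 3.0, 3.0)` on two dose grids with 2 mm voxels.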
Abstract:
Retrospective clinical datasets are often characterized by a relatively small sample size and a large amount of missing data. In this case, a common way of handling the missingness is to discard from the analysis patients with missing covariates, further reducing the sample size. Alternatively, if the mechanism that generated the missingness allows it, incomplete data can be imputed on the basis of the observed data, avoiding a reduction in sample size and allowing complete-data methods to be applied later on. Moreover, methodologies for data imputation may depend on the particular purpose and may achieve better results by considering specific characteristics of the domain. The problem of missing data treatment is studied in the context of survival tree analysis for the estimation of a prognostic patient stratification. Survival tree methods usually address this problem by using surrogate splits, that is, splitting rules that use other variables yielding results similar to the original ones. Instead, our methodology consists in modeling the dependencies among the clinical variables with a Bayesian network, which is then used to perform data imputation, thus allowing the survival tree to be applied to the completed dataset. The Bayesian network is learned directly from the incomplete data using a structural expectation-maximization (EM) procedure in which the maximization step is performed with an exact anytime method, so that the only source of approximation is the EM formulation itself. On both simulated and real data, our proposed methodology usually outperformed several existing methods for data imputation, and the imputation so obtained improved the stratification estimated by the survival tree (especially with respect to using surrogate splits).
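The workflow described is: impute the incomplete covariates from the observed data, then fit the downstream complete-data model on the completed dataset. The sketch below illustrates that workflow with scikit-learn's IterativeImputer rather than the paper's Bayesian-network structural-EM imputation; the column names, missingness rate, and imputer settings are assumptions for demonstration only.

```python
# Impute-then-analyze sketch using IterativeImputer as a stand-in for the
# Bayesian-network imputation described in the abstract (assumed toy data).
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.normal(65, 10, 200),
    "tumor_size": rng.normal(30, 8, 200),
    "marker": rng.normal(1.2, 0.4, 200),
})
# Introduce ~15% missingness at random (illustrative only).
df = df.mask(rng.random(df.shape) < 0.15)

# Iteratively model each incomplete covariate from the others and impute.
imputer = IterativeImputer(max_iter=20, random_state=0)
completed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)

# 'completed' can now be passed to a complete-data method such as a survival tree.
print(completed.isna().sum())
```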
Abstract:
In the last few years there has been heightened interest in data treatment and analysis with the aim of discovering hidden knowledge and eliciting relationships and patterns within these data. Data mining techniques (also known as Knowledge Discovery in Databases) have been applied over a wide range of fields such as marketing, investment, fraud detection, manufacturing, telecommunications and health. In this study, well-known data mining techniques such as artificial neural networks (ANN), genetic programming (GP), forward-selection linear regression (LR) and k-means clustering are proposed to the health and sports community in order to aid with resistance training prescription. Appropriate resistance training prescription is effective for developing fitness and health and for enhancing general quality of life. Resistance exercise intensity is commonly prescribed as a percentage of the one-repetition maximum (1RM). The 1RM (dynamic muscular strength, one-repetition maximum or one-execution maximum) is operationally defined as the heaviest load that can be moved over a specific range of motion, one time and with correct technique. The safety of the 1RM assessment has been questioned, as such an enormous effort may lead to muscular injury. Prediction equations could help to tackle the problem of estimating the 1RM from submaximal loads, in order to avoid, or at least reduce, the associated risks. We built different models from data on 30 men who performed up to 5 sets to exhaustion at different percentages of the 1RM in the bench press, until reaching their actual 1RM. A comparison of different existing prediction equations was also carried out. The LR model seems to outperform the ANN and GP models for 1RM prediction in the range between 1 and 10 repetitions. At 75% of the 1RM some subjects (n = 5) could perform 13 repetitions with proper technique in the bench press, whilst other subjects (n = 20) performed significantly (p < 0.05) more repetitions at 70% than at 75% of their actual 1RM. Rating of perceived exertion (RPE) does not seem to be a good predictor of the 1RM when all sets are performed to exhaustion, as no significant differences (p < 0.05) were found in the RPE at 75%, 80% and 90% of the 1RM. Also, years of experience and weekly hours of strength training are better correlated with the 1RM (p < 0.05) than body weight. The O'Connor et al. prediction equation appears to emerge from the data gathered and seems to be the most accurate 1RM prediction equation among those proposed in the literature and used in this study. Epley's 1RM prediction equation is reproduced by means of data simulation from 1RM equations in the literature. Finally, future lines of research are proposed related to the problem of 1RM prediction by means of genetic algorithms, neural networks and clustering techniques.
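For reference, the commonly cited textbook forms of two of the prediction equations compared in the study (Epley and O'Connor et al.) are sketched below; the exact coefficients used in this work are not reproduced here, so treat the values as illustrative.

```python
# Textbook forms of two 1RM prediction equations (illustrative only).
def epley_1rm(load_kg: float, reps: int) -> float:
    """Epley: 1RM = load * (1 + reps / 30)."""
    return load_kg * (1.0 + reps / 30.0)

def oconnor_1rm(load_kg: float, reps: int) -> float:
    """O'Connor et al.: 1RM = load * (1 + 0.025 * reps)."""
    return load_kg * (1.0 + 0.025 * reps)

if __name__ == "__main__":
    # Example: 80 kg bench press performed for 8 repetitions to exhaustion.
    print(f"Epley estimate:    {epley_1rm(80, 8):.1f} kg")
    print(f"O'Connor estimate: {oconnor_1rm(80, 8):.1f} kg")
```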
Abstract:
The web has undergone a drastic transformation in recent years, mainly because of its popularization and the enormous amount of information it holds. These factors have driven the shift from the so-called Web of Documents to the Semantic Web, in which every piece of information is related to other information. The main advantages of linked information lie in its ease of reuse, its accessibility and its availability to be found by the user. This work aims to highlight the usefulness of Linked Data applied to the geographic domain and to show how such data can be used today. To this end, spatial linked data from different sources was exploited through external servers, or SPARQL endpoints. In addition, a private server capable of serving linked information stored on a personal computer was used. The exploitation of linked information was implemented in a web application written in JavaScript, which tries to completely shield the user from how the data are treated internally by the application. The application also includes several modules and options able to interact with the queries sent to the servers, providing a more intuitive and pleasant environment for the user.
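The application itself is written in JavaScript and is not reproduced here; as an illustration of the same idea, the Python sketch below queries a public SPARQL endpoint for resources carrying WGS84 coordinates and reads the standard JSON result bindings. The endpoint, ontology class and query are examples only and are not taken from the work described.

```python
# Example of consuming geographic Linked Data from a SPARQL endpoint
# (DBpedia used purely as a public example endpoint).
import requests

ENDPOINT = "https://dbpedia.org/sparql"
QUERY = """
PREFIX geo:  <http://www.w3.org/2003/01/geo/wgs84_pos#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?city ?lat ?long WHERE {
  ?city a <http://dbpedia.org/ontology/City> ;
        rdfs:label ?label ;
        geo:lat  ?lat ;
        geo:long ?long .
  FILTER (lang(?label) = "en")
} LIMIT 10
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
# Standard SPARQL JSON results: results -> bindings -> variable -> value.
for row in resp.json()["results"]["bindings"]:
    print(row["city"]["value"], row["lat"]["value"], row["long"]["value"])
```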
Abstract:
The purpose of this review is to integrate and summarize specific measurement topics (instrument and metric choice, validity, reliability, how many and what types of days, reactivity, and data treatment) appropriate to the study of youth physical activity. Research-quality pedometers are necessary to aid interpretation of steps per day collected in a range of young populations under a variety of circumstances. Steps per day is the most appropriate metric choice, but steps per minute can be used to interpret time-in-intensity in specifically delimited time periods (e.g., physical education class). Reported intraclass correlations (ICC) have ranged from .65 over 2 days (although higher values have also been reported for 2 days) to .87 over 8 days (although higher values have been reported for fewer days). Reported ICCs are lower on weekend days (.59) versus weekdays (.75) and lower over vacation days (.69) versus school days (.74). There is no objective evidence of reactivity at this time. Data treatment includes (a) identifying and addressing missing values, (b) identifying outliers and reducing the data appropriately if necessary, and (c) transforming the data as required in preparation for inferential analysis. As more pedometry studies in young populations are published, these preliminary methodological recommendations should be modified and refined.
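A minimal pandas sketch of the three data-treatment steps listed above ((a) missing values, (b) outliers, (c) transformation) follows; the 1,000-30,000 steps/day plausibility bounds, the minimum number of valid days and the log transform are illustrative assumptions, not recommendations from the review.

```python
# Illustrative steps/day data treatment: missing values, outliers, transform.
import numpy as np
import pandas as pd

def treat_steps(steps: pd.DataFrame, min_valid_days: int = 3) -> pd.DataFrame:
    """steps: one row per participant, one column per monitored day."""
    # (a) Missing values: keep only participants with enough recorded days.
    valid = steps.notna().sum(axis=1) >= min_valid_days
    out = steps.loc[valid].copy()

    # (b) Outliers: values outside an assumed plausible range are set missing.
    out = out.where((out >= 1_000) & (out <= 30_000))

    # Summarize with the mean of the days that remain.
    out["mean_steps"] = out.mean(axis=1, skipna=True)

    # (c) Transformation: log-transform to reduce positive skew before analysis.
    out["log_mean_steps"] = np.log(out["mean_steps"])
    return out
```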
Abstract:
Despite their ecological significance as decomposers and their evolutionary significance as the most speciose eusocial insect group outside the Hymenoptera, termite (Blattodea: Termitoidae or Isoptera) evolutionary relationships have yet to be well resolved. Previous morphological and molecular analyses strongly conflict at the family level and are marked by poor support for backbone nodes. A mitochondrial (mt) genome phylogeny of termites was produced to test relationships among the recognised termite families, improve nodal support and test the phylogenetic utility of rare genomic changes found in the termite mt genome. Complete mt genomes were sequenced for 7 of the 9 extant termite families, with additional representatives of each of the two most speciose families, Rhinotermitidae (3 of 7 subfamilies) and Termitidae (3 of 8 subfamilies). The mt genome of the well-supported sister group of termites, the subsocial cockroach Cryptocercus, was also sequenced. A highly supported tree of termite relationships was produced by all analytical methods and data treatment approaches; however, the relationship of the termites + Cryptocercus clade to other cockroach lineages was strongly affected by the strong nucleotide compositional bias found in termites relative to other dictyopterans. The phylogeny supports previously proposed suprafamilial termite lineages, the Euisoptera and Neoisoptera, a later-derived Kalotermitidae as sister group of the Neoisoptera, and a monophyletic clade of dampwood (Stolotermitidae, Archotermopsidae) and harvester termites (Hodotermitidae). In contrast to previous termite phylogenetic studies, nodal support was very high for family-level relationships within termites. Two rare genomic changes in the mt genome control region were found to be molecular synapomorphies for major clades. An elongated stem-loop structure defines the clade Polyphagidae + (Cryptocercus + termites), and a further series of compensatory base changes in this stem-loop is synapomorphic for the Neoisoptera. The complicated repeat structures first identified in Reticulitermes, composed of short (A-type) and long (B-type) repeats, define the clade Heterotermitinae + Termitidae, while the secondary loss of A-type repeats is synapomorphic for the non-macrotermitine Termitidae.
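As a side note on the compositional bias mentioned above, a simple way to quantify it is to tabulate base composition per taxon directly from a FASTA file of mt genomes; the sketch below does this with no external dependencies. The file name and the AT-content summary are hypothetical.

```python
# Per-taxon base composition from a FASTA file (illustrative helper).
from collections import Counter

def base_composition(path: str) -> dict[str, dict[str, float]]:
    comps, name, seq = {}, None, []
    def flush():
        # Record the composition of the sequence collected so far.
        if name is not None:
            counts = Counter("".join(seq).upper())
            total = sum(counts[b] for b in "ACGT") or 1
            comps[name] = {b: counts[b] / total for b in "ACGT"}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                flush()
                name, seq = line[1:], []
            else:
                seq.append(line)
        flush()
    return comps

# Example: AT content per mitochondrial genome (file name is hypothetical).
# for taxon, c in base_composition("mt_genomes.fasta").items():
#     print(taxon, round(c["A"] + c["T"], 3))
```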
Abstract:
We wished to determine whether changing our centre's practice to using Acticoat instead of Silvazine as our first-line burns dressing provided a better standard of care in terms of efficacy, cost and ease of use. A retrospective cohort study was performed examining 328 Silvazine-treated patients from January 2000 to June 2001 and 241 Acticoat-treated patients from July 2002 to July 2003. During those periods the respective dressings were used exclusively. There was no significant difference in age, %BSA or mechanism of burn between the groups. In the Silvazine group, 25.6% of children required grafting compared with 15.4% in the Acticoat group (p=0.001). When patients requiring grafting were excluded, the time taken for re-epithelialisation in the Acticoat group (14.9 days) was significantly less than that for the Silvazine group (18.3 days), p=0.047. More wounds required long-term scar management in the Silvazine group (32.6%) than in the Acticoat group (29.5%); however, this difference was not significant. There was only one positive blood culture in each group, indicating that both Silvazine and Acticoat are potent antimicrobial agents. The use of Acticoat as our primary burns dressing has dramatically changed our clinical practice. Inpatients now account for only 18% of total admissions, with the vast majority of patients treated on an outpatient basis. In terms of cost, Acticoat was demonstrated to be less expensive over the treatment period than Silvazine. We have concluded that Acticoat is a safe, cost-effective, efficacious dressing that reduces the time to re-epithelialisation and the requirement for grafting and long-term scar management, compared with Silvazine.
Abstract:
468 p.
Abstract:
The object of this study is the perception of graduates of the Faculty of Nursing of the Universidade do Estado do Rio de Janeiro (ENF/UERJ) regarding the world of work and the influence of their training on professional practice. The objectives were: I) to describe the perception of ENF/UERJ graduates of the world of work in health and nursing, considering their undergraduate training; II) to identify the facilitators and difficulties perceived by graduates in professional practice and their repercussions for the health-disease process, considering the configuration of the world of work; III) to analyze the gaps and convergences identified by graduates between their training at ENF/UERJ and the configuration of the world of work in health and nursing; and IV) to discuss the situations in which graduates act that are actually or potentially capable of transforming the working reality in which they find themselves. This is a qualitative, descriptive and exploratory study, carried out at ENF/UERJ and approved through Plataforma Brasil under number 360.021. The participants were 30 ENF/UERJ graduates who completed the program between the first semester of 2000 and the second semester of 2010. Data were collected from December 2013 to February 2014 through semi-structured interviews. Data were treated using Thematic Content Analysis, from which four empirical categories emerged: the dynamism and complexity of the contemporary world of work and professional training, from the perspective of the nursing graduate; the dialectic of the world of work: facilitators and adversities faced in daily work and their influence on the health-disease process; nursing training and its interface with the world of work: dilemmas and challenges to overcome; and meanings, skills and competencies built throughout training and their repercussions on professional practice. With respect to the objectives of this study, it is concluded that the participants present a critical point of view and a macro-structural vision of the contemporary world of work that is close to the discussion of sociologists and scholars of work. It can therefore be considered that training at ENF/UERJ contributed to the construction of this critical, reflective and politicized view of the working reality that the graduates experience.
Abstract:
The absorptivities of the color components in a mixture can be obtained by applying Gaussian elimination with selection of the principal element (partial pivoting) to the standards. These values can then be used in the flexible tolerance simplex method to give the composition of the samples. In the experimental design and data treatment, an effort was made to minimize the errors of the results according to the principle of optimization. When the difference between the absorptivities of the color materials is significant relative to the experimental error, the pr...
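The scheme described is essentially a two-step one: obtain absorptivity coefficients from the standards by solving a linear (Beer-Lambert) system, then recover the composition of a mixture from those coefficients. The sketch below illustrates the second step with made-up numbers, using ordinary least squares via NumPy in place of the flexible tolerance simplex method; it is an illustration of the data treatment, not the authors' procedure.

```python
# Illustrative mixture-composition step: given an absorptivity matrix E obtained
# from the standards, solve E @ c = a_mix for the component concentrations c.
# All numerical values here are made up for demonstration.
import numpy as np

# Absorptivities (rows: wavelengths, columns: components), as would be obtained
# from single-component standards.
E = np.array([[0.92, 0.15],
              [0.40, 0.88],
              [0.10, 0.55]])

# Absorbance spectrum of the mixture at the same wavelengths.
a_mix = np.array([0.61, 0.72, 0.38])

# Least-squares solution (used here in place of the flexible tolerance simplex).
c, *_ = np.linalg.lstsq(E, a_mix, rcond=None)
print("estimated concentrations:", c)
```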
Abstract:
The performance of a comprehensive two-dimensional liquid chromatography system was greatly improved over that reported previously by using a silica monolithic column for the second-dimension separation. Owing to the higher elution speed on the second-dimension monolithic column, the first-dimension column efficiency and the analysis rate can be greatly improved compared with a conventional second-dimension column. The developed system was applied to the analysis of the methanol extracts of two Umbelliferae herbs, Ligusticum chuanxiong Hort. and Angelica sinensis (Oliv.) Diels, using a CN column for the first-dimension separation and a silica monolithic ODS column for the second-dimension separation; the resulting three-dimensional chromatograms were treated by normalizing the peak heights to the value of the highest peak, or to a set value, using software written in-house. Many more peaks of low-abundance components in the TCM extracts can now be clearly detected than reported before, owing to the large differences in the amounts of the components in the extracts. With the above improvements in separation performance and data treatment, about 120 components in the methanol extract of Rhizoma Chuanxiong and 100 in A. sinensis were separated with UV detection within 130 min. Compared with the previously reported result, the number of detected peaks doubled while the analysis time was halved. (c) 2005 Published by Elsevier B.V.
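The in-house normalization software is not public; the small sketch below shows the general idea of the data treatment described: scaling the chromatographic surface either to its highest peak or to a chosen ceiling so that low-abundance components become visible. The array layout and the ceiling value are assumptions.

```python
# Peak-height normalization sketch for a 2D-LC chromatographic surface.
import numpy as np

def normalize_chromatogram(z: np.ndarray, ceiling: float | None = None) -> np.ndarray:
    """z: 2D array of detector response (first-dimension time x second-dimension time)."""
    ref = z.max() if ceiling is None else ceiling
    # Clip anything above the chosen ceiling, then scale to [0, 1] so that
    # small peaks are no longer dwarfed by the most abundant component.
    return np.clip(z, 0.0, ref) / ref
```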
Abstract:
Doctoral thesis presented to the Universidade Fernando Pessoa as part of the requirements for the degree of Doctor of Earth Sciences.
Abstract:
The use of coal and other fossil fuels will remain for decades the main source of energy for power generation, despite the important efforts made to replace, as far as possible, fossil fuels with renewable power sources. As is well documented, the production of greenhouse gases (GHG), mainly CO2, arises primarily from the combustion of fossil fuels, and the increasing application of Clean Coal Technologies (CCTs) is expected to substantially mitigate the emission of such gases. There is consequently a need to promote CO2 abatement through Zero Emission (Carbon-Free) Technologies (ZETs), which include CO2 capture, transport and geological storage, i.e. the so-called CCS (Carbon Capture and Storage) technologies. In fact, these technologies are the only ones presently able to meet the ambitious EU targets set out under the "20 20 by 2020" EU energy and environment programme, jointly with the economic aspects of EU Directives 2003/87/EC, 2004/101/EC and 2009/29/EC concerning the Greenhouse Gas Emissions Allowance Trading Scheme (ETS). The European Commission's formal admission that these targets will be impossible to reach without the implementation and contribution of geological storage clearly demonstrates the importance of this issue, and for this reason EC Directive 2009/31/EC of April 23, 2009 on the geological storage of CO2 was recently published. In considering the technical and economic viability of CCS technologies, the latter in competition with the ETS scheme, it is believed that public perception will dictate the success of the development and implementation of CO2 geological storage at a large industrial scale. This means that, in order to successfully implement CCS technologies, not only must public opinion be taken into consideration but objective information must also be provided to the public in order to raise awareness of the subject, as recognized in Directive 2009/31/EC itself. In this context, the Fernando Pessoa Foundation / University Fernando Pessoa, through its CIAGEB (Global Change, Energy, Environment and Bioengineering) RDID&D unit, is the sponsor of an engineering project for the geological sequestration of CO2 in the Douro Coalfield meta-anthracites, the COSEQ Project, and is therefore also engaged in public perception surveys with regard to CCS technologies. At this stage, the original European ACCSEPT inquiry was translated into Portuguese and submitted only to the "Fernando Pessoa Community", comprising university lecturers, students, other employees, as well as former students and persons with a professional or academic relationship with the university (c. 5000 individuals). The results obtained from this first inquiry will be used to improve the survey informatics system in terms of communication, database and data treatment prior to resubmission of the inquiry to the Portuguese public at large. The present publication summarizes the process and the results obtained from the ACCSEPT survey distributed to the "Fernando Pessoa Community". In total, 525 replies were received and analysed, representing 10.5% of the sample.
The results were systematically compared with those obtained in other European countries, as reported by the ACCSEPT inquiry, as well as with those from an identical inquiry launched in Brazil.