48 results for Region growing algorithms
at Instituto Politécnico do Porto, Portugal
Abstract:
Mental illness remains surrounded by myths, prejudice and stereotypes, despite the growing investment in research and in improving treatment in this area of health. As a consequence, people with mental illness are discriminated against and stigmatized by the general public and the media, as well as by their own families and by the mental health professionals who care for them. Since mental health professionals establish a bridge between illness and health, their attitudes and practices are expected to contribute to the recovery of the person with mental illness. However, if these professionals themselves also hold stigmatizing attitudes and beliefs towards mental illness, the rehabilitation process may be compromised. In this sense, and given the research gaps in this area, this work aims to explore and clarify the presence or absence of stigmatizing attitudes among mental health professionals and, when present, how they are characterized. To that end, 24 qualitative interviews were conducted with mental health professionals working in three institutions in the Porto region, namely the psychiatry service of a general hospital, a specialized hospital and community structures. The analysis of the discursive material collected from Social Workers, Nurses, Psychiatrists, Psychologists and Occupational Therapists shows the presence of stigmatizing beliefs and attitudes towards mental illness, regardless of age, training or workplace, except for a few aspects in which age and profession seem to have an influence. This means that variations in professionals' attitudes are likely to be mainly a consequence of their personal characteristics.
Abstract:
In the present paper we consider strategies of innovation, risk and proactivity as entre-/intrapreneurship strategies. The study was carried out in a Portuguese and a Polish region. In Portugal the region was Vale do Sousa, located in northern Portugal; the Polish region was the Lublin Voivodeship, situated in the south-eastern part of the country. The study focused on the Industrial and Construction sectors. In order to obtain a valid sample, 251 firms were analysed in Portugal and 215 in Poland. However, the minimum sample size in Poland should be 323; since this is a work in progress, we are aiming for this number of questionnaires. Each strategy was analysed individually for both regions, and the results point to a lack of an entrepreneurship culture in firms' management. Only proactivity presented a positive result. Polish firms tend to be more innovative and more willing to take risks, while Portuguese firms score slightly higher on proactivity. Combining the strategy results, 61.2% of Portuguese firms present a low level of entrepreneurship, while 60% of Polish firms present a moderate level. As for good levels of intrapreneurship, Portugal accounts for 5.2% of firms, whereas the figure is 19.1% in Poland.
Abstract:
In the present paper we analysed the behaviour of firms in the construction and manufacturing sectors located in the region of Vale do Sousa, in the north of Portugal. From the literature, despite some disagreements, it is possible to conclude that planning is crucial for firms' survival and growth. Cooperation is another aspect the literature presents as an important factor for firms' sustainability. It also plays a major role in competition, since firms are adopting coopetition strategies. By studying a sample of 251 firms, it was possible to see that the majority started their business without formal planning and continue to operate without it. Regarding cooperation, there is a general lack of it; the cooperation that does exist occurs mainly at a vertical level. These vertical relations were also identified in stakeholders' involvement.
Abstract:
Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Upsampling is performed by manufacturers to better fit the acquired images on the display screen and is applied whenever there is a need to increase or decrease the total number of pixels. This paper aims to compare the “hqnx” and the “nxSaI” magnification algorithms with two interpolation algorithms – “nearest neighbor” and “bicubic interpolation” – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not as noteworthy, although it was clear that bicubic interpolation presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx resized images presented better quality than the 4xSaI and nearest-neighbor interpolated images; however, their intense “halo effect” greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation appears to be the most suitable, and its ever wider range of applications seems to confirm this, establishing it as an efficient algorithm across multiple image types.
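As a minimal sketch of the kind of comparison described above, the snippet below enlarges a synthetic grayscale image with nearest-neighbor and bicubic interpolation and compares a simple whole-image profile. The hqnx/nxSaI algorithms have no standard Python implementation, so only the two interpolation methods are illustrated; the toy image and the profile measure are illustrative assumptions.

```python
# Sketch: nearest-neighbor vs bicubic upsampling on a synthetic grayscale image.
import numpy as np
from scipy import ndimage

def upsample(image: np.ndarray, factor: int, method: str) -> np.ndarray:
    """Enlarge a 2-D image by an integer factor using the chosen interpolation."""
    orders = {"nearest": 0, "bicubic": 3}       # spline order passed to ndimage.zoom
    return ndimage.zoom(image, factor, order=orders[method])

# Toy 64x64 image standing in for a Nuclear Medicine acquisition (assumption).
rng = np.random.default_rng(0)
img = ndimage.gaussian_filter(rng.random((64, 64)), sigma=3)

for factor in (2, 4):
    nn = upsample(img, factor, "nearest")
    bc = upsample(img, factor, "bicubic")
    # A whole-image plot profile is approximated here by the column-wise mean.
    profile_diff = np.abs(nn.mean(axis=0) - bc.mean(axis=0)).max()
    print(f"{factor}x: max profile difference, nearest vs bicubic = {profile_diff:.4f}")
```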
Abstract:
Introduction: A major focus of the data mining process - especially of machine learning research - is to automatically learn to recognize complex patterns and help take adequate decisions based strictly on the acquired data. Since imaging techniques like MPI – Myocardial Perfusion Imaging in Nuclear Cardiology – can represent a large part of the daily workflow and generate gigabytes of data, Computerized Analysis of the data could have advantages over Human Analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI Stress studies and in the decision process concerning the continuation – or not – of the evaluation of each patient. The objective pursued is to automatically classify a patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” would proceed directly to the Rest part of the exam, “Negative” would be directly exempted from continuation, and only the “Indeterminate” group would require the clinician's analysis, thus saving clinicians' effort, increasing workflow fluidity at the technologist's level and probably sparing patients' time. Methods: The WEKA v3.6.2 open-source software was used for a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study, using as reference the corresponding clinical results signed by nuclear cardiology experts, on the “SPECT Heart Dataset” available from the University of California – Irvine Machine Learning Repository. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Areas” were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of the scintigraphic data obtained with MPI, namely after Stress acquisition, thus eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
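The study itself used WEKA; as a rough, analogous sketch in Python, the snippet below compares a one-rule-like baseline (a depth-1 decision tree standing in for OneR), a decision tree (comparable to J48) and Naïve Bayes on the UCI SPECT Heart data, reporting accuracy and ROC AUC. The file path and the exact cross-validation setup are assumptions.

```python
# Analogous scikit-learn sketch of the WEKA classifier comparison described above.
import numpy as np
from sklearn.model_selection import cross_validate
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier

data = np.loadtxt("SPECT.train", delimiter=",")   # hypothetical local copy of the UCI file
y, X = data[:, 0], data[:, 1:]                    # first column is the diagnosis label

models = {
    "OneR-like (depth-1 tree)": DecisionTreeClassifier(max_depth=1),
    "Decision tree (J48-like)": DecisionTreeClassifier(),
    "Naive Bayes": BernoulliNB(),                 # SPECT features are binary
}

for name, model in models.items():
    scores = cross_validate(model, X, y, cv=10, scoring=("accuracy", "roc_auc"))
    print(f"{name}: accuracy={scores['test_accuracy'].mean():.3f} "
          f"ROC AUC={scores['test_roc_auc'].mean():.3f}")
```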
Abstract:
Pesticides were synthesized in response to the growing need for food and the need to prevent the destruction of crops by animals and insects. Among the wide range of pesticides, some are endocrine disruptors, which poses a danger to human health since they can trigger changes in living organisms even at very low concentrations. Due to several factors, namely leaching, wind and other environmental agents, as well as the presence of agricultural land along several rivers, these rivers are contaminated with endocrine-disrupting pesticides. The aim of this work was to assess which endocrine-disrupting pesticides are present in the river waters of the northern region. Solid-phase microextraction was used with a PDMS fibre (100 μm). The optimization conditions were tested, namely the amount of NaCl, the amount of methanol, the injector temperature and the exposure time. The optimal parameters obtained were 0% NaCl, an exposure time of 45 min, an injector temperature of 260 °C and 2.5% methanol. Separation of all the endocrine-disrupting pesticides was achieved with the following temperature programme: start at 60 °C for one minute, followed by an increase of 20 °C/min up to 200 °C, held for one minute, and then an increase of 5 °C/min up to 245 °C, held for 40 min. Calibration curves were prepared between 0.01 μg/L and 10 μg/L. However, a lack of reproducibility between injections was observed with this technique. The rivers analysed were the Douro, Tâmega, Ria de Aveiro, Lima, Minho, Sousa, Águeda, Cávado and Leça. In the Tâmega the following pesticides were found: diazinon, α-HCH, β-HCH, δ-HCH, lindane, HCB, simazine/atrazine, vinclozolin, alachlor, 2,4-D, malathion, aldrin, bifenthrin, methoxychlor and fenvalerate. In the Douro, HCB, simazine/atrazine, vinclozolin, 2,4-D, malathion, aldrin, fenvalerate and deltamethrin are present. In the Lima, diazinon, α-HCH, δ-HCH, 2,4-D, HCB, vinclozolin, lindane, simazine/atrazine, alachlor, malathion, aldrin, fenvalerate and deltamethrin were found. In the Sousa the pesticides found were diazinon, HCB, aldrin, α-HCH, β-HCH, δ-HCH, lindane, simazine/atrazine, 2,4-D, cypermethrin, alachlor, fenvalerate and malathion. In the Cávado, diazinon, α-HCH, β-HCH, δ-HCH, lindane, HCB, 2,4-D, malathion, methoxychlor, cypermethrin and fenvalerate are present. In the Ria de Aveiro, diazinon, α-HCH, β-HCH, δ-HCH, lindane, HCB, simazine/atrazine, 2,4-D, malathion and aldrin were found. In the Águeda, diazinon, HCB, 2,4-D, aldrin and malathion are present. Finally, in the Leça, diazinon, 2,4-D, alachlor, malathion, aldrin, cypermethrin and fenvalerate are present. The importance of this work lies in demonstrating the presence of these endocrine-disrupting pesticides in the surface waters of the northern region.
Abstract:
Every year European citizens become victims of devastating fires, which are especially disastrous for Southern European countries. Apart from the numerous health and economic consequences, fires generate hazardous pollutants that are introduced into the environment, thus representing serious risks for public health. In that regard, particulate matter (PM) is of major concern. Thus, the objectives of this work were to characterize the trend of forest fire occurrences and burnt area during the period between 2005 and 2010 and to study the influence of forest fires on levels of particulate matter PM10 and PM2.5. In 2010, 22,026 forest fires occurred in Portugal. The northern region was the most affected by forest fires, with 27% of occurrences in the Oporto district. The annual means of PM10 and PM2.5 concentrations at two urban background sites were 25±14 μg m−3 and 8.2±4.9 μg m−3, and 17±13 μg m−3 and 7.3±5.9 μg m−3, respectively. At both sites the highest levels of the PM fractions were observed during July and August of 2010, corresponding to the period when the majority (66%) of forest fires occurred. Furthermore, the PM10 daily limit was exceeded at the two sites during 20 and 5 days, respectively; 56% and 60% of those exceedances, respectively, occurred during the forest fire season. Considering that the risks of forest fire ignition and severity are enhanced by elevated temperatures, climate change might increase the environmental impacts of forest fires.
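The exceedance counts reported above can be derived from daily monitoring data as in the sketch below. The CSV layout and column names ("date", "pm10", "site") are assumptions; the EU daily limit value for PM10 is 50 μg m−3, and July–August is taken as the fire season, as in the text.

```python
# Sketch: counting PM10 daily-limit exceedances and their share in the fire season.
import pandas as pd

PM10_DAILY_LIMIT = 50.0  # ug/m3 (EU daily limit value)

df = pd.read_csv("pm10_daily_2010.csv", parse_dates=["date"])  # hypothetical file
df["fire_season"] = df["date"].dt.month.isin([7, 8])           # July and August

for site, grp in df.groupby("site"):
    exceed = grp[grp["pm10"] > PM10_DAILY_LIMIT]
    share = exceed["fire_season"].sum() / max(len(exceed), 1)
    print(f"{site}: {len(exceed)} exceedance days, "
          f"{share:.0%} of them during the forest fire season")
```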
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space, providing an efficient method of neighbour modelling. The genetic algorithm produces special patterns resembling those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and by modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent, easy-to-interpret set of model parameters for distinct operating conditions.
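A minimal sketch of the general idea follows: a genetic algorithm evolving two object types on a hexagonal tessellation, with a fitness that rewards the interface between the types. The grid size, selection scheme, mutation rate and the exact fitness are illustrative assumptions, not the formulation used in the paper.

```python
# Sketch: GA on a hexagonal grid whose fitness counts heterogeneous neighbour pairs.
import random

SIZE = 12                      # SIZE x SIZE grid in axial hex coordinates
HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def random_grid():
    return [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

def fitness(grid):
    """Count hexagonal neighbour pairs holding different object types (interface length)."""
    score = 0
    for q in range(SIZE):
        for r in range(SIZE):
            for dq, dr in HEX_NEIGHBOURS:
                nq, nr = q + dq, r + dr
                if 0 <= nq < SIZE and 0 <= nr < SIZE and grid[q][r] != grid[nq][nr]:
                    score += 1
    return score // 2          # each pair was counted twice

def crossover(a, b):
    return [[random.choice((a[q][r], b[q][r])) for r in range(SIZE)] for q in range(SIZE)]

def mutate(grid, rate=0.02):
    return [[1 - c if random.random() < rate else c for c in row] for row in grid]

population = [random_grid() for _ in range(40)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # truncation selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(30)
    ]
print("best interface length:", fitness(max(population, key=fitness)))
```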
Abstract:
To avoid additional hardware deployment, indoor localization systems have to be designed in such a way that they rely on existing infrastructure only. Besides processing measurements between nodes, the localization procedure can incorporate all available environment information. In order to enhance the performance of Wi-Fi based localization systems, the innovative solution presented in this paper also considers negative information. An indoor tracking method inspired by Kalman filtering is also proposed.
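For orientation, the sketch below shows a generic constant-velocity Kalman filter fed with Wi-Fi-derived 2-D position fixes, i.e. the kind of filtering the proposed tracking method draws on. The noise parameters are assumptions, and the paper's use of negative information is not reproduced here.

```python
# Sketch: constant-velocity Kalman filter for 2-D indoor position tracking.
import numpy as np

dt = 1.0                                   # seconds between Wi-Fi position fixes (assumed)
F = np.array([[1, 0, dt, 0],               # state transition for state [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # only position is observed
Q = 0.05 * np.eye(4)                       # process noise (assumed)
R = 4.0 * np.eye(2)                        # measurement noise in m^2 (assumed)

x = np.zeros(4)                            # initial state estimate
P = 10.0 * np.eye(4)                       # initial uncertainty

def step(z):
    """One predict/update cycle for a position measurement z = [x, y]."""
    global x, P
    x = F @ x                              # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y                          # update
    P = (np.eye(4) - K @ H) @ P
    return x[:2]

for z in [np.array([0.5, 0.2]), np.array([1.1, 0.9]), np.array([2.0, 1.8])]:
    print("filtered position:", step(z))
```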
Abstract:
Consider the problem of assigning real-time tasks on a heterogeneous multiprocessor platform comprising two different types of processors — such a platform is referred to as a two-type platform. We present two linearithmic time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a given task set, if there exists a feasible task-to-processor-type assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type, then (i) SA is guaranteed to find such a feasible task-to-processor-type assignment, where the same restriction on task migration applies, given a platform in which processors are 1+α/2 times faster, and (ii) SA-P succeeds in finding a feasible task-to-processor assignment where tasks are not allowed to migrate between processors, given a platform in which processors are 1+α times faster, where 0<α≤1. The parameter α is a property of the task set — it is the maximum utilization of any task, which is less than or equal to 1.
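Restated more formally (a restatement of the guarantee above in the abstract's own terms, with u_i denoting task utilizations):

```latex
% Restatement of the speed-up guarantee described above.
% \alpha is the maximum task utilization, assumed to satisfy 0 < \alpha \le 1.
\[
  \alpha = \max_i u_i, \qquad 0 < \alpha \le 1 .
\]
If a task-to-processor-type assignment feasible with intra-type migration exists
on a unit-speed two-type platform, then SA finds such an assignment on a platform
of speed $1 + \tfrac{\alpha}{2}$, and SA-P finds a fully partitioned
(non-migrative) assignment on a platform of speed $1 + \alpha$.
```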
Abstract:
In real-time systems, there are two distinct trends for scheduling task sets on unicore systems: non-preemptive and preemptive scheduling. Non-preemptive scheduling is obviously not subject to any preemption delay, but its schedulability may be quite poor, whereas fully preemptive scheduling is subject to preemption delay but benefits from higher flexibility in scheduling decisions. The time delay induced by task preemptions is a major source of pessimism in the analysis of the task Worst-Case Execution Time (WCET) in real-time systems. Preemptive scheduling policies including non-preemptive regions are a hybrid solution between the non-preemptive and fully preemptive scheduling paradigms, combining the benefits of both worlds. In this paper, we exploit the connection between the progression of a task through its operations and the knowledge of the preemption delays as a function of that progression. The pessimism in the preemption delay estimation is thereby reduced in comparison with state-of-the-art methods, thanks to the additional information available in the analysis.
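As a simple illustration of this reasoning (not the paper's exact formulation), a progress-aware analysis replaces a single global preemption-delay constant with a function of the preemption point:

```latex
% Illustrative sketch only: progress-aware preemption-delay bound, assuming at
% most k preemptions occurring at progress points p_1, ..., p_k of the task.
\[
  C^{\mathrm{preempt}} \;\le\; C + \sum_{j=1}^{k} \gamma(p_j)
  \;\le\; C + k \cdot \max_{p} \gamma(p)
\]
% C is the non-preemptive WCET and \gamma(p) bounds the delay of a preemption
% occurring after p units of progress; using \gamma(p_j) at the actual
% preemption points instead of the global maximum is what reduces the pessimism.
```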
Abstract:
In embedded systems, the timing behaviour of the control mechanisms is sometimes of critical importance for operational safety. These high-criticality systems require strict compliance with the offline predicted task execution time. The execution of a task, when subject to preemption, may vary significantly in comparison to its non-preemptive execution. Hence, when preemptive scheduling is required to operate the workload, preemption delay estimation is of paramount importance. In this paper a preemption delay estimation method for floating non-preemptive scheduling policies is presented. This work builds on [1], extending the model and optimising it considerably. The preemption delay function is subject to a major tightness improvement, considering the WCET analysis context. Moreover, additional information is provided in the form of an extrinsic cache misses function, which enables the method to provide a solution in situations where the non-preemptive regions are small. Finally, experimental results from the implementation of the proposed solutions in Heptane are presented for real benchmarks, validating the significance of this work.
Abstract:
In this paper we discuss the challenges and design principles of an implementation of slot-based task-splitting algorithms in the Linux 2.6.34 kernel. We show that this kernel version provides the required features for implementing such scheduling algorithms. We show that the real behaviour of the scheduling algorithm is very close to the theoretical one. We run and discuss experiments on 4-core and 24-core machines.