972 results for Low Level Light Therapy
Abstract:
The clothing sector in Portugal is still seen, in many respects, as a traditional sector with characteristics such as a low level of qualifications, less flexible labour legislation, stronger unionisation, very low salaries, and a low capacity for investment in innovation and new technology. It is, nevertheless, a very important sector for the labour market, with increasing weight in the export structure. Globalisation and delocalisation are having a strong impact on the organisation of work and on occupational careers in the sector. Under the pressure of global competition on lead times and prices, very few companies are able to keep a market position without changes in the organisation of work and in the workforce, and those that respond well to such challenges achieve better economic stability. Companies have found different ways to face this reality according to their size, capital and position. Two main paths can be identified: in one, companies outsource part or all of production to another territory (for example, several manufacturing tasks), close down, and/or dismiss workers; in the other, companies upgrade their capabilities by investing, for example, in design, worker training, and the conception and introduction of new or original products. This paper presents some results from the European project WORKS – Work organisation and restructuring in the knowledge society (6th Framework Programme), focusing on the Portuguese case studies in several clothing companies and the implications of the global context for companies in general and for workers in particular, in a comparative analysis with other European countries.
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications, which has motivated many initiatives to develop scientific workflow tools. However, existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the workflow task specification, decentralising the control of workflow activities, and allowing tasks to run autonomously on distributed infrastructures, for instance on Clouds. Furthermore, many workflow tools only support the execution of Directed Acyclic Graphs (DAGs), without the concept of iteration, whereas some activities execute for millions of iterations over long periods of time and require dynamic workflow reconfiguration after a given iteration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on the Process Networks model, in which workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures, e.g. on Clouds. Each AWA executes a Task developed as a Java class that implements a generic interface, allowing end-users to code their applications without concern for low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also supports dynamic workflow reconfiguration and monitoring of workflow execution. We describe how AWARD supports dynamic reconfiguration and discuss typical workflow reconfiguration scenarios. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to a small dedicated cluster and to the Amazon Elastic Compute Cloud (EC2).
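The data-driven, tuple-space coordination the abstract describes can be sketched in miniature. This is not the AWARD API: `TupleSpace`, `Activity`, `put`/`take` and the tag names are all invented for illustration, and real AWAs run as parallel processes against a shared (possibly distributed) tuple space rather than sequentially in one process as here.

```python
class TupleSpace:
    """A toy associative store: values are grouped under a tag string."""
    def __init__(self):
        self._store = {}

    def put(self, tag, value):
        self._store.setdefault(tag, []).append(value)

    def take(self, tag):
        # A real tuple space would block until a match exists;
        # here we assume the tuple is already present.
        return self._store[tag].pop(0)


class Activity:
    """An autonomous activity: take an input tuple, run its task,
    emit an output tuple (data-driven, no central controller)."""
    def __init__(self, space, in_tag, out_tag, task):
        self.space, self.in_tag, self.out_tag = space, in_tag, out_tag
        self.task = task  # any callable, mirroring the generic task interface

    def step(self):
        value = self.space.take(self.in_tag)
        self.space.put(self.out_tag, self.task(value))


space = TupleSpace()
space.put("raw", 3)
producer = Activity(space, "raw", "squared", lambda x: x * x)
consumer = Activity(space, "squared", "result", lambda x: x + 1)
producer.step()   # consumes "raw", produces "squared"
consumer.step()   # consumes "squared", produces "result"
result = space.take("result")
print(result)     # 3 squared, plus one: 10
```

Because each activity only sees tags in the shared space, either activity could be replaced, or the pipeline rewired, without the other knowing, which is the property that makes dynamic reconfiguration possible.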
Abstract:
OBJECTIVE To evaluate factors associated with user satisfaction in the Tuberculosis Control Program. METHODS A cross-sectional study of 295 patients aged ≥ 18 years with two or more outpatient visits in the Tuberculosis Control Program, in five cities of the metropolitan region of Rio de Janeiro, RJ, Southeastern Brazil, in 2010. Considering an estimated population of 4,345 patients, the sampling plan included 15 health care units participating in the program, divided into two strata: units in Rio de Janeiro City, selected with probability proportional to the monthly average number of outpatient visits, and units in the other four cities. Within the units, four temporal clusters of five patients each were selected with equal probability, totalling 300 patients. A questionnaire investigating the users' clinical and sociodemographic variables, and the aspects of care and service relevant to user satisfaction, was applied to the patients. Descriptive statistics on the users and their satisfaction with the program were obtained, and the effects of factors associated with satisfaction were estimated. RESULTS Patients were predominantly male (57.7%), with a mean age of 40.9 years and a low level of schooling. The mean treatment time was 4.1 months, mostly self-administered (70.4%); additionally, 25.8% had previously been treated for tuberculosis. Satisfaction was high overall, especially regarding medication provision and the respect shown to patients by health professionals. Younger patients (≤ 30 years), those on self-administered treatment, and those with higher education reported less satisfaction. Suggestions to improve the services included having more doctors (70.0%) and offering exams at the place of attendance (55.1%). CONCLUSIONS Patient satisfaction with the Tuberculosis Control Program was generally high, although lower among younger patients, those with university education, and those on self-administered treatment.
The study indicates the need for changes to structural and organizational aspects of care, and provides practical support for its improvement.
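The probability-proportional-to-size selection used for the Rio de Janeiro units can be sketched as follows. The unit names and visit counts are invented, and a real PPS design typically samples without replacement, unlike `random.choices`, so this is only an illustration of the weighting idea.

```python
import random

# Hypothetical health care units -> monthly average outpatient visits
units = {"unit_A": 120, "unit_B": 60, "unit_C": 300, "unit_D": 20}

random.seed(0)  # reproducible draw
chosen = random.choices(
    population=list(units),        # candidate units
    weights=list(units.values()),  # selection probability proportional to visits
    k=3,                           # number of units to select (with replacement)
)
print(chosen)
```

Here `unit_C`, with 300 of the 500 total visits, is drawn with probability 0.6 on each of the three draws, which is what "probability proportional to the monthly average number of outpatient visits" means operationally.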
Abstract:
The present study develops and implements a specific methodology for assessing the health risks arising from workers' occupational exposure to ionizing radiation in the fertilizer manufacturing industry. Negative effects on the health of exposed workers are identified according to the types and levels of exposure to which they are subject, namely an increased risk of cancer even under long-term exposure to low-level radiation. Ionizing radiation types, measurement methods, and measuring equipment are characterized. The methodology is applied in a case study of a phosphate fertilizer plant, assessing occupational exposure to ionizing radiation caused by external radiation and by the inhalation of radioactive gases and dust.
Abstract:
OBJECTIVE To estimate the incidence of, and identify risk factors for, intimate partner violence during the postpartum period. METHODS This prospective cohort study was conducted with women aged 18-49 years enrolled in the Brazilian Family Health Strategy in Recife, Northeastern Brazil, between 2005 and 2006. Of the 1,057 women interviewed during pregnancy and postpartum, the 539 who did not report violence before or during pregnancy were evaluated. A theoretical-conceptual framework was built with three hierarchically ordered levels of factors: the women's and partners' sociodemographic and behavioral characteristics, and the relationship dynamics. The incidence of and risk factors for intimate partner violence were estimated by Poisson regression. RESULTS The incidence of violence during postpartum was 9.3% (95%CI 7.0;12.0). Isolated psychological violence was the most common (4.3%; 95%CI 2.8;6.4). Psychological violence overlapped with physical violence in 3.3% (95%CI 2.0;5.3) of cases, and with physical and/or sexual violence in almost 2.0% (95%CI 0.8;3.0). The risk of partner violence during postpartum was increased for women with a low level of education (RR = 2.6; 95%CI 1.3;5.4), without their own income (RR = 1.7; 95%CI 1.0;2.9), who perpetrated physical violence against their partner without being assaulted first (RR = 2.0; 95%CI 1.2;3.4), who had a very controlling partner (RR = 2.5; 95%CI 1.1;5.8), or who had frequent fights with their partner (RR = 1.7; 95%CI 1.0;2.9). CONCLUSIONS The high incidence of intimate partner violence during postpartum, and its association with aspects of the quality of the couple's relationship, demonstrate the need for public policies that promote conflict mediation and enable forms of empowerment for women to address the cycle of violence.
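As a rough illustration of the risk ratios (RR) and confidence intervals reported above, a crude 2x2 risk ratio with a log-scale 95% CI can be computed as below. The counts are invented, and the study itself used Poisson regression over hierarchically ordered covariates, which adjusts for confounding in a way this sketch does not.

```python
import math

# Invented counts: events / group size in exposed vs. unexposed women
events_exposed, n_exposed = 20, 100      # e.g. low-schooling group
events_unexposed, n_unexposed = 10, 200  # comparison group

# Crude risk ratio: ratio of the two incidence proportions
rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Standard error of log(RR), then a 95% CI back on the ratio scale
se = math.sqrt(1 / events_exposed - 1 / n_exposed
               + 1 / events_unexposed - 1 / n_unexposed)
low = math.exp(math.log(rr) - 1.96 * se)
high = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.1f} (95%CI {low:.1f};{high:.1f})")
```

An RR of 2.6 for low education, as in the abstract, reads the same way: the exposed group's postpartum incidence is 2.6 times that of the comparison group, with the CI excluding 1.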
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. However, existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the task specification, decentralising the control of workflow activities so that their tasks can run on distributed infrastructures, and supporting dynamic workflow reconfiguration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on Process Networks, in which workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures. Each AWA executes a task developed as a Java class with a generic interface, allowing end-users to code their applications without low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also enables dynamic workflow reconfiguration. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to the Amazon Elastic Compute Cloud (EC2).
Abstract:
Male mice (NMRI strain) of 3 and 5 g were inoculated i.p. with 8 x 10^6 and 9 x 10^4 metatrypomastigotes/g harvested from a 12-day-old LIT culture of the "Dog-82" strain of Trypanosoma rangeli. At regular intervals after inoculation, the animals were sacrificed and portions of heart, liver, spleen, lung, thigh, kidney, stomach, intestine, brain, sternum, and vertebral column were embedded in paraffin, sectioned, and stained with haematoxylin-eosin and Giemsa colophonium. Pathology was encountered in the first five tissues cited above. The subcutaneous, periosteal, interstitial, and peribronchial connective tissues, and later the muscle cells of the heart, were heavily parasitized by amastigotes and trypomastigotes. The possible reasons for the decrease in tissue parasitosis while the parasitemia is reaching its peak, and for the low level of inflammation in the parasitized tissues, are discussed. The observations of other workers, as well as the results described here, indicate that certain strains of T. rangeli may, under certain conditions, cause pathological alterations in mammals.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the Master's degree in Electrical and Computer Engineering
Abstract:
The intent of this dissertation is to review relevant existing management systems and chemical industry initiatives to identify synergies, overlaps, and gaps with respect to Sustainability best practices, to map the barriers to the incorporation of Sustainability, and to formulate recommendations that facilitate the execution of Sustainability practices within existing management systems. A chemical industry Sustainability survey was conducted through APEQ, the Portuguese association of chemical companies, constituting the first baseline on the topic for this national industry association. The commonly used international standards and the Responsible Care® (RC) initiative were cross-referenced against the United Nations Global Compact Assessment Tool. Guidance on how to incorporate Sustainability into a company's modus operandi was distilled into Sustainability Playbooks. The survey revealed that 73% of the participating APEQ member companies have a Sustainability Plan. Both large and small/medium APEQ member companies see the market's unwillingness to pay extra for 'greener' products as one of the main barriers. Large APEQ enterprises see complexity of implementation and low return on investment as the other most significant barriers, while small/medium enterprises respond that the difficulty of predicting customer sustainability needs is the other most significant barrier. Among the many other insights from this survey reported to APEQ, Life Cycle Assessment practices were found to have a low level of implementation and were also considered of low importance, identifying a very important opportunity in Sustainability practices to be addressed by APEQ. Two hundred and seventy-three assessment points from the United Nations Global Compact Assessment Tool, plus five additional items, were cross-referenced with international standard requirements.
With the authorization of the intellectual property owners, the United Nations Global Compact Assessment Tool was modified to introduce actionable recommendations for each gap identified per management standard. The tool was automated to output specific recommendations for 63 possible combinations after simply selecting from a list of commonly used management standards and the RC initiative. Finally, this modified tool was incorporated into Playbooks for the Incorporation of Sustainability at two levels: a "Get Started Playbook" for beginners or small/medium-size enterprises, and an "Advanced Playbook" as a second stage or for large enterprises.
Abstract:
Pultruded products are the target of growing demand due to their excellent mechanical properties and low chemical reactivity, which ensure a low level of maintenance and allow an easier assembly process than equivalent steel bars. In order to improve the mechanical drawing process and solve some acoustic and thermal insulation problems, pultruded pipes of glass fibre reinforced plastic (GFRP) can be filled with special products that increase their performance in these respects. The great challenge of this work was to design new equipment able to produce pultruded pipes filled with cork or polymeric pre-shaped bars in a continuous process. The project was carried out successfully; the new equipment was built and integrated into the existing pultrusion line, making it possible to obtain new products with higher added value in the market and covering needs previously identified in the field of civil construction.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the Master's degree in Informatics Engineering
Abstract:
Dissertation presented to the Instituto Politécnico do Porto – Instituto Superior de Contabilidade e Administração do Porto, to obtain the Master's degree in Entrepreneurship and Internationalisation, under the supervision of Orlando Manuel Martins Marques de Lima Rua, PhD
Abstract:
This paper presents a new parallel implementation of a previously developed hyperspectral coded aperture (HYCA) algorithm for compressive sensing on graphics processing units (GPUs). The HYCA method combines the ideas of spectral unmixing and compressive sensing, exploiting the high spatial correlation observed in the data and the generally low number of endmembers needed to explain the data. The proposed implementation exploits the GPU architecture at a low level, thus taking full advantage of the computational power of GPUs through shared memory and coalesced memory accesses. The proposed algorithm is evaluated not only in terms of reconstruction error but also in terms of computational performance, using two different NVIDIA GPU architectures: the GeForce GTX 590 and the GeForce GTX TITAN. Experimental results using real data reveal significant speedups with regard to the serial implementation.
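The low-dimensional structure that HYCA-style methods exploit, a few endmembers explaining each pixel, is the linear mixing model. A minimal sketch, with invented spectra and fractions, shows how a mixed pixel arises as a nonnegative, sum-to-one combination of endmember signatures:

```python
# Invented endmember spectra: one spectrum per row, 4 "bands" each
endmembers = [
    [0.9, 0.8, 0.1, 0.1],  # e.g. a vegetation-like signature
    [0.2, 0.3, 0.7, 0.8],  # e.g. a soil-like signature
]
abundances = [0.25, 0.75]  # material fractions (nonnegative, sum to 1)

# Linear mixing model: each band of the pixel is the abundance-weighted
# sum of the endmember values in that band
pixel = [sum(a * e[band] for a, e in zip(abundances, endmembers))
         for band in range(4)]
print(pixel)
```

Because every pixel lies near the simplex spanned by a handful of such signatures, far fewer measurements than bands suffice to reconstruct the scene, which is what makes the compressive-sensing formulation work.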
Abstract:
Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images, which means that several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach that aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient GPU implementation, using CUDA, of the unsupervised linear unmixing method SISAL. The method finds the smallest simplex by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses. The results presented herein indicate that the GPU implementation can significantly accelerate the method's execution on big datasets while maintaining its accuracy.
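To make the unmixing goal concrete, here is the inversion of the linear mixing model for *known* endmembers via ordinary least squares (normal equations). SISAL solves the much harder problem of estimating the endmember simplex itself, under constraints, so this sketch, with invented numbers, only illustrates what "material fractions at each pixel" means:

```python
# Two invented endmember spectra over 4 bands
e1 = [0.9, 0.8, 0.1, 0.1]            # e.g. vegetation-like
e2 = [0.2, 0.3, 0.7, 0.8]            # e.g. soil-like
pixel = [0.375, 0.425, 0.55, 0.625]  # mixed pixel: 0.25*e1 + 0.75*e2

dot = lambda u, v: sum(a * b for a, b in zip(u, v))

# Normal equations (E^T E) a = E^T p, solved directly for 2 endmembers
g11, g12, g22 = dot(e1, e1), dot(e1, e2), dot(e2, e2)
b1, b2 = dot(e1, pixel), dot(e2, pixel)
det = g11 * g22 - g12 * g12
a1 = (b1 * g22 - g12 * b2) / det
a2 = (g11 * b2 - g12 * b1) / det
print(round(a1, 6), round(a2, 6))  # recovered fractions
```

With noise-free data the fractions 0.25 and 0.75 are recovered exactly; in practice the abundances must additionally be constrained to be nonnegative and sum to one, and the endmembers are unknown, which is where simplex-fitting methods like SISAL come in.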
Abstract:
Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral sensors acquire hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes comprising several GBs per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks in hyperspectral data exploitation; however, unmixing algorithms can be computationally very expensive, and even power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel, programmable systems. Specifically, several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for on-board data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.