965 results for Process analysis
Abstract:
Among the treatments proposed to repair lesions of degenerative joint disease (DJD), chondroprotective nutraceuticals composed of glucosamine and chondroitin sulfate are a non-invasive therapy with properties that favor cartilage health. Although used in humans, they are also available for veterinary use, administered as a nutritional supplement without prescription, since they are registered only with the Inspection Service, which does not require safety and efficacy testing. The absence of the efficacy and safety tests that the Ministry of Agriculture requires of veterinary medicines, together with the lack of scientific studies proving their benefits, raises doubts about the effectiveness of the concentrations of these active substances. In this context, the objective of this study was to evaluate the efficacy of a veterinary chondroprotective nutraceutical based on chondroitin sulfate and glucosamine in the repair of osteochondral defects in the lateral femoral condyle of 48 dogs, through clinical and radiographic analysis. The animals were divided into a treatment group (TG) and a control group (CG), and only the TG received the nutraceutical every 24 hours at the dose recommended by the manufacturer. The results at the four treatment times (15, 30, 60 and 90 days) showed that the chondroprotective nutraceutical, at the dose, formulation and treatment durations used, did not improve clinical signs and did not radiographically influence the repair of the defects, since the treated and control groups showed similar radiographic findings at the end of the treatments.
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at the source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered, either in the form of an Orc specification or as a collection of code fragments corresponding to typical coordination patterns identified in the system. Tool support is also discussed.
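As an illustration only (not the authors' tool, which targets general source code and recovers Orc specifications), the following Python sketch shows the kind of dependency-graph extraction and slicing the abstract refers to: it builds a rough call graph with the standard ast module and keeps the functions that transitively reach a hypothetical coordination primitive, here named service_call.

```python
# Illustrative sketch: slicing a rough call-dependency graph for calls that
# reach a hypothetical coordination primitive (service_call is an assumption).
import ast
from collections import defaultdict

SOURCE = """
def fetch(order):
    return service_call("inventory", order)

def bill(order):
    return service_call("billing", order)

def process(order):
    stock = fetch(order)
    invoice = bill(order)
    return stock, invoice

def log(msg):
    print(msg)
"""

COORDINATION_API = {"service_call"}  # hypothetical coordination primitive

def build_call_graph(tree):
    """Map each function name to the set of names it calls."""
    graph = defaultdict(set)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            for inner in ast.walk(node):
                if isinstance(inner, ast.Call) and isinstance(inner.func, ast.Name):
                    graph[node.name].add(inner.func.id)
    return graph

def coordination_slice(graph):
    """Keep only functions that (transitively) reach the coordination API."""
    relevant = set()
    changed = True
    while changed:
        changed = False
        for caller, callees in graph.items():
            if caller not in relevant and callees & (COORDINATION_API | relevant):
                relevant.add(caller)
                changed = True
    return relevant

tree = ast.parse(SOURCE)
graph = build_call_graph(tree)
print("call graph:", dict(graph))
print("coordination slice:", coordination_slice(graph))
```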
Abstract:
What sort of component coordination strategies emerge in a software integration process? How can such strategies be discovered and further analysed? How close are they to the coordination component of the envisaged architectural model that was supposed to guide the integration process? This paper introduces a framework in which such questions can be discussed and illustrates its use by describing part of a real case study. The approach is based on a methodology that enables semi-automatic discovery of coordination patterns from source code, combining generalized slicing techniques and graph manipulation.
Abstract:
The work presented herein follows ongoing research that aims to analyse methodological practices to be applied in Design Education. A reflection on methodological strategies in Design Education and on the function of drawing in Design marks the starting point of this study. We then developed an interdisciplinary pedagogical experience with the first-year Graphic Design students of our institution (IPCA). In the current academic year, 2013/2014, we continued to develop this project, introducing changes to the initial proposal. The major alterations focused on the aspects that could be strengthened in terms of interdisciplinarity. In this article, the authors describe those changes and discuss the outcomes of the new proposal. As we have already reported, this investigation follows a reflection on the working methods to be adopted in Design Education. This is in line with other previously published works that propose the enlargement of Design into new knowledge fields such as Experience or Service Design, changing not only the role of the graphic designer but also the skills required of a professional designer (Findelli, 2001; Lawson, 2006; Ciampa-Brewer, 2010). Furthermore, concepts such as cooperation and multidisciplinary design, amongst others, have frequently been debated as design teaching strategies (Heller and Talarico, 2011, pp. 82-85). These educational approaches also have an impact on our research. The analysis of all these authors' contributions, together with a reflection on our own teaching practice, allowed us to propose an improved interdisciplinary intervention.
Abstract:
Work accidents affect business and society as a whole. Fewer accidents mean fewer sick leaves, which results in lower costs and less disruption of the production process, with clear advantages for the employer. But workers and their households also bear a significant burden following a work accident, only partially compensated by insurance systems. Furthermore, the consequences of work accidents for the State and society also need to be considered. When an organization performs an integrated risk analysis in evaluating its Occupational Health and Safety Management System, several steps are suggested to address the identified risk situations; namely, to avoid risks, a series of preventive measures is identified. The organization should make a detailed analysis of the monetary impact (positive or negative) of each of the measures considered. It is also important to consider the impact of each measure on society, which calls for an adequate economic cost-benefit analysis. In the present paper, a case study in a textile finishing company is presented. The study concentrates on the dyeing and printing sections. For each of the potential risks, several preventive measures have been identified and the corresponding costs and benefits estimated. Subsequently, the benefit/cost ratio (B/C) of these measures has been calculated, both in financial terms (from the organisation's perspective) and in economic terms (including the benefits for the worker and for society). Results show that, while the financial analysis from the company's standpoint does not justify the preventive measures, when the externalities are taken into account the B/C ratio increases significantly and the investments are fully justified.
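A minimal sketch of the benefit/cost arithmetic described above, in Python; all figures are invented placeholders rather than data from the study, and serve only to show how the financial and economic B/C ratios can diverge once externalities are included.

```python
# Minimal sketch of the benefit/cost comparison described above; every figure
# is a made-up placeholder, not data from the textile finishing case study.
def benefit_cost_ratio(benefits, costs):
    """Return the B/C ratio of a preventive measure."""
    return sum(benefits) / sum(costs)

# Hypothetical preventive measure in the dyeing section.
company_benefits = [4_000.0]    # e.g. avoided sick leave and production disruption costs
external_benefits = [9_000.0]   # e.g. avoided burden on workers, households and the State
costs = [10_000.0]              # e.g. ventilation upgrade

financial_bc = benefit_cost_ratio(company_benefits, costs)
economic_bc = benefit_cost_ratio(company_benefits + external_benefits, costs)

print(f"financial B/C: {financial_bc:.2f}")  # < 1: not justified for the firm alone
print(f"economic  B/C: {economic_bc:.2f}")   # > 1: justified once externalities are counted
```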
Abstract:
The role of middle management is essential when managing integrative and emergent strategy formation processes. We highlight the importance of its role in connecting the micro and macro organizational levels, offering an important contribution to the examination of the strategy-as-practice perspective and the integrative strategy formation process. The main goal of this research is to analyse the relationship between the integrative strategy formation process and the roles of middle management under the strategy-as-practice perspective. To examine this, we adopted a qualitative methodology, conducting a case analysis in a Spanish university. Data were collected by means of personal interviews with members of different levels of the institution, document analysis and direct observation. Among the preliminary results, we find that the university develops an integrative strategy formation process and confers on middle management an important role that extends throughout the organization.
Abstract:
In an increasingly complex society, regulatory policies emerge as an important tool in public management. Nevertheless, regulation per se is no longer enough, and the agenda for regulatory reform is growing. In this context, Brazil has implemented Regulatory Impact Analysis (RIA) in its regulatory agencies. Brazilian specificities therefore have to be considered and, in this regard, a systematic approach provides a significant contribution. This article aims to address some critical questions that policy-makers should ask themselves before committing to the implementation of a RIA system in the Brazilian context. From a long-term perspective, the implementation of RIA must be seen as part of a permanent change in the administrative culture, understanding that RIA should be used as a further resource in the decision-making process rather than as a final solution.
Abstract:
The aim of this study was to characterize the trajectory of answerability in Brazil. In light of studies based on the historical neo-institutionalist approach, formal institutional changes adopted at the federal level between 1985 and 2014 that favor the typical requirements of answerability (information and justification) were identified and analyzed through the content analysis technique. The conclusion is that the trajectory of answerability in contemporary Brazil can be characterized as continuous, occurring primarily through the layering strategy, and that its leitmotif, since its origin, has consisted of matters of a financial and budgetary nature. Nevertheless, a recent influence of deeper democratic issues has been observed.
Abstract:
Benchmarking is an important tool for organisations seeking to improve their productivity, product quality, process efficiency or services. Through benchmarking, organisations can compare their performance with competitors and identify their strengths and weaknesses. This study carries out a benchmarking analysis of the main Iberian seaports, with a special focus on the efficiency of their container terminals. To this end, data envelopment analysis (DEA) is used, since several researchers consider it the most effective method to quantify a set of key performance indicators. To obtain a more reliable diagnostic tool, DEA is used together with data mining to compare the operational data of the ports' container terminals during 2007. Taking into account that seaports are nodes in global logistics networks, performance evaluation is essential for effective decision making aimed at improving their efficiency and, therefore, their competitiveness.
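For illustration, the sketch below implements a standard input-oriented CCR DEA model with SciPy's linear-programming solver; the terminal data are made-up placeholders, not the 2007 operational data used in the study, and the study's exact DEA formulation may differ.

```python
# Input-oriented CCR DEA sketch with SciPy; the data below are invented
# placeholders standing in for container-terminal inputs and outputs.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (container terminals); columns = inputs / outputs
inputs = np.array([[50, 12], [60, 10], [40, 15], [70, 20]], dtype=float)  # e.g. quay length, cranes
outputs = np.array([[900], [1100], [800], [1000]], dtype=float)           # e.g. TEUs handled

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of DMU `o` (1.0 means efficient)."""
    n, m = inputs.shape          # n DMUs, m inputs
    s = outputs.shape[1]         # s outputs
    c = np.zeros(n + 1)          # decision variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0                   # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):           # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-inputs[o, i]], inputs[:, i])))
        b_ub.append(0.0)
    for r in range(s):           # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
        b_ub.append(-outputs[o, r])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

for o in range(inputs.shape[0]):
    print(f"terminal {o}: efficiency = {ccr_efficiency(inputs, outputs, o):.3f}")
```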
Abstract:
The kraft pulps produced from heartwood and sapwood of Eucalyptus globulus at 130 °C, 150 °C, and 170 °C were characterized by wet chemistry (total lignin as the sum of the Klason and soluble lignin fractions) and by pyrolysis (total lignin denoted as py-lignin). The total lignin content obtained with both methods was similar. In the course of delignification, the py-lignin values were higher (by 2 to 5%) than the Klason values, which is in line with the importance of soluble lignin for total lignin determination. Pyrolysis analysis presents advantages over wet chemical procedures, and it can be applied to wood and pulps to determine lignin contents at different stages of the delignification process. The py-lignin values were used for kinetic modelling of delignification, with very high predictive value and results similar to those obtained by modelling with wet chemical determinations.
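As a hedged illustration of what such kinetic modelling can look like, the sketch below fits a simple first-order delignification model to hypothetical py-lignin values with SciPy; the time points, lignin contents and model form are assumptions, not the paper's data or its actual kinetic model.

```python
# Sketch: first-order delignification kinetics fitted to illustrative py-lignin
# values; all numbers are invented, and the paper's kinetic model may differ.
import numpy as np
from scipy.optimize import curve_fit

time_min = np.array([0, 30, 60, 90, 120], dtype=float)   # cooking time
py_lignin = np.array([22.0, 14.5, 9.8, 6.9, 5.1])         # lignin content, % (py-lignin)

def first_order(t, L0, k):
    """Simple first-order decay of lignin content during the kraft cook."""
    return L0 * np.exp(-k * t)

(L0, k), _ = curve_fit(first_order, time_min, py_lignin, p0=(20.0, 0.01))
print(f"estimated initial lignin: {L0:.1f}%, rate constant: {k:.4f} 1/min")
print("predicted lignin at 150 min:", round(first_order(150, L0, k), 2), "%")
```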
Abstract:
A QuEChERS method has been developed for the determination of 14 organochlorine pesticides in 14 soils from different Portuguese regions with a wide range of compositions. The extracts were analysed by GC-ECD (gas chromatography with electron-capture detection) and confirmed by GC-MS/MS (tandem mass spectrometry). The organic matter content is a key factor in the efficiency of the process. An optimization was carried out according to the soils' organic carbon level, divided into two groups: HS (organic carbon > 2.3%) and LS (organic carbon < 2.3%). The method was validated through linearity, recovery, precision and accuracy studies. The quantification was carried out using a matrix-matched calibration to minimize the matrix effect. Acceptable recoveries were obtained (70-120%) with a relative standard deviation of ≤16% for the three levels of contamination. The limits of detection and quantification in HS soils ranged from 3.42 to 23.77 μg kg−1 and from 11.41 to 79.23 μg kg−1, respectively. For LS soils, the limits of detection ranged from 6.11 to 14.78 μg kg−1 and the limits of quantification from 20.37 to 49.27 μg kg−1. Of the 14 collected soil samples, only one showed a residue of dieldrin (45.36 μg kg−1) above the limit of quantification. This methodology combines the advantages of QuEChERS, GC-ECD detection and GC-MS/MS confirmation, producing a very rapid, sensitive and reliable procedure which can be applied in routine analytical laboratories.
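The validation arithmetic mentioned above (recovery, relative standard deviation, and LOD/LOQ from a matrix-matched calibration curve) can be sketched as follows; every number is invented for illustration, and the laboratory's actual calculation conventions may differ.

```python
# Illustrative validation arithmetic: recovery, RSD, and LOD/LOQ estimated from
# a calibration curve. All values are made up, not the study's measurements.
import numpy as np

# Hypothetical spiked-sample results for one pesticide at one spike level (ug/kg)
spiked_level = 50.0
measured = np.array([46.2, 48.9, 44.7, 51.0, 47.5])

recovery = measured.mean() / spiked_level * 100        # %
rsd = measured.std(ddof=1) / measured.mean() * 100     # relative standard deviation, %

# Matrix-matched calibration (concentration vs. detector response)
conc = np.array([5, 10, 25, 50, 100], dtype=float)
response = np.array([102, 198, 510, 1010, 1995], dtype=float)
slope, intercept = np.polyfit(conc, response, 1)
residual_sd = np.std(response - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope                         # limit of detection
loq = 10.0 * residual_sd / slope                        # limit of quantification

print(f"recovery {recovery:.1f}%, RSD {rsd:.1f}%, LOD {lod:.2f}, LOQ {loq:.2f} ug/kg")
```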
Abstract:
Purpose: The aim of this paper is to promote qualitative methodology within the management research community. The specific objective is to propose an empirical research process based on the case study method, so as to ensure rigour in the empirical research process and allow future research to follow a similar procedure to the one proposed. Design/methodology/approach: Following a qualitative methodological approach, we propose a research process that develops over four phases, each with several stages. This study analyses the preparatory and fieldwork phases and their stages. Findings: The paper shows the influence that case studies have on the qualitative empirical research process in management. Originality/value: The case study method assumes an important role within qualitative research by allowing for the study and analysis of certain types of phenomena that occur inside organisations and in respect of which quantitative studies cannot provide an answer.
Abstract:
Master's dissertation in Business Management/MBA.
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
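The setup described above can be reproduced in miniature: the sketch below generates linear mixtures whose Dirichlet-distributed abundances sum to one (and are therefore statistically dependent) and applies FastICA from scikit-learn; the random endmember signatures and noise level are assumptions, not the paper's simulated or real data.

```python
# Sketch of the experimental setup: linear mixtures with sum-to-one (Dirichlet)
# abundances, unmixed with FastICA. Endmember signatures are random stand-ins.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_pixels, n_bands, n_endmembers = 2000, 50, 3

M = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))         # endmember signatures
S = rng.dirichlet(alpha=[1.0] * n_endmembers, size=n_pixels)     # abundances, rows sum to 1
X = S @ M.T + 0.001 * rng.standard_normal((n_pixels, n_bands))   # noisy mixed spectra

ica = FastICA(n_components=n_endmembers, random_state=0)
S_hat = ica.fit_transform(X)                                      # estimated "sources"

# Correlation of each true abundance with its best-matching ICA component:
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:n_endmembers, n_endmembers:])
print("best |correlation| per endmember:", corr.max(axis=1).round(3))
# Values well below 1 illustrate the dependence induced by the sum-to-one constraint.
```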
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
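The following sketch is not DECA itself; it only illustrates abundance estimation under the same two constraints (non-negativity and sum-to-one) using the classic augmented non-negative least-squares trick, with randomly generated endmember signatures standing in for real ones.

```python
# Not DECA: a fully constrained least-squares sketch that enforces the same
# constraints (a >= 0, sum(a) ~= 1) with an augmented NNLS fit; data are random.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_bands, n_endmembers = 50, 3
M = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))  # known endmember signatures
true_a = np.array([0.6, 0.3, 0.1])                        # ground-truth abundances
y = M @ true_a + 0.001 * rng.standard_normal(n_bands)     # observed pixel spectrum

def fcls(M, y, delta=1e3):
    """Fully constrained least squares: a >= 0 and sum(a) ~= 1."""
    M_aug = np.vstack([M, delta * np.ones(M.shape[1])])    # extra row enforces sum-to-one
    y_aug = np.append(y, delta)
    a, _ = nnls(M_aug, y_aug)
    return a

est = fcls(M, y)
print("estimated abundances:", est.round(3), "sum =", est.sum().round(3))
```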