948 results for generic finiteness
Abstract:
Today's professionals are required to master competencies that transcend their own role, area, or qualification level, which challenges companies to identify and develop these competencies through competency-based human resource management (HRM), whether more or less formalized. Competency-based management enables strategic, integrated, and coherent management of HRM processes, since it can cut across all HRM subsystems and is articulated with the overall business objectives. In this context the following investigation emerges: an exploratory, qualitative study whose objectives are to understand in depth the reality of several companies in terms of the transversal competencies they value and their competency-based human resource management practices. We interviewed ten human resource managers and company administrators from companies with varying numbers of employees, thus representing the micro, small, medium, and large companies of northern Portugal. We conclude that the transversal competencies most valued by companies are flexibility, interpersonal skills, adaptability to change, and teamwork. This investigation also showed that the presence of competencies in HRM is characterized by strong informality. Within this informality, transversal competencies are present in hiring, retention, and development plans, and are less frequently used in practices such as performance management and appraisal, career management, and benefits and rewards management. These results benefit scientific production as well as companies, education systems, professionals, and students, both because of the importance that competencies, and transversal competencies in particular, assume in today's market and because they provide up-to-date data and leads for future research.
Abstract:
After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:
- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the abductive semantics coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are equally applicable to the well-founded semantics of normal logic programs.
- Two approaches for dealing with the contradiction that may appear once explicit negation is introduced into logic programs, together with a proof of their equivalence. One approach consists in avoiding contradiction, and is based on restrictions on the adoption of abductive hypotheses. The other consists in removing contradiction, and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for the contradiction.
Abstract:
The main objective of this work was to investigate the application of experimental design techniques for the identification of Michaelis-Menten kinetic parameters. More specifically, this study attempts to elucidate the relative advantages/disadvantages of employing complex experimental design techniques in relation to equidistant sampling when applied to different reactor operation modes. All studies were supported by simulation data of a generic enzymatic process that obeys the Michaelis-Menten kinetic equation. Different aspects were investigated, such as the influence of the reactor operation mode (batch, fed-batch with pulse-wise feeding and fed-batch with continuous feeding) and of the experimental design optimality criteria on the effectiveness of kinetic parameter identification. The following experimental design optimality criteria were investigated: 1) minimization of the trace of the inverse of the Fisher information matrix (FIM) (A-criterion), 2) maximization of the determinant of the FIM (D-criterion), 3) maximization of the smallest eigenvalue of the FIM (E-criterion) and 4) minimization of the quotient between the largest and the smallest eigenvalue of the FIM (modified E-criterion). The comparison and assessment of the different methodologies was made on the basis of the Cramér-Rao lower bound (CRLB) errors with respect to the parameters vmax and Km of the Michaelis-Menten kinetic equation. Concerning the reactor operation mode, it was concluded that fed-batch (pulses) is better than batch operation for parameter identification. When the former operation mode is adopted, the vmax CRLB error is lowered by 18.6% while the Km CRLB error is lowered by 26.4% when compared to the batch operation mode.
Regarding the optimality criteria, the best method was the A-criterion, with an average vmax CRLB of 6.34% and 5.27% for batch and fed-batch (pulses), respectively, while presenting a Km CRLB of 25.1% and 18.1%, again for batch and fed-batch (pulses), respectively. As a general conclusion of the present study, it can be stated that experimental design is justified if the starting parameter CRLB errors are below 19.5% (vmax) and 45% (Km) for batch processes, and below 42% and 50% for the fed-batch (pulses) process. Otherwise, equidistant sampling is the more rational choice. This conclusion clearly indicates that, for fed-batch operation, the use of experimental design is likely to largely improve the identification of Michaelis-Menten kinetic parameters.
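The four criteria can be made concrete with a small sketch. The fragment below (an illustration, not the thesis code; the sample points, parameter values and noise level are assumptions) builds the FIM for the Michaelis-Menten model from the rate sensitivities and evaluates the A-, D-, E- and modified E-criteria:

```python
import math

def fim(substrates, vmax, km, sigma=0.05):
    """2x2 Fisher information matrix for (vmax, Km), assuming independent
    Gaussian noise of std dev sigma on rate measurements taken at the
    given substrate concentrations."""
    f11 = f12 = f22 = 0.0
    for s in substrates:
        dv_dvmax = s / (km + s)             # sensitivity of the rate w.r.t. vmax
        dv_dkm = -vmax * s / (km + s) ** 2  # sensitivity of the rate w.r.t. Km
        f11 += dv_dvmax * dv_dvmax / sigma ** 2
        f12 += dv_dvmax * dv_dkm / sigma ** 2
        f22 += dv_dkm * dv_dkm / sigma ** 2
    return [[f11, f12], [f12, f22]]

def criteria(F):
    """A-, D-, E- and modified E-criterion values for a symmetric 2x2 FIM."""
    (a, b), (_, d) = F
    det = a * d - b * b                       # D-criterion: maximize
    trace_inv = (a + d) / det                 # A-criterion: minimize
    disc = math.sqrt(((a - d) / 2) ** 2 + b * b)
    lam_min = (a + d) / 2 - disc              # E-criterion: maximize
    lam_max = (a + d) / 2 + disc
    return {"A": trace_inv, "D": det, "E": lam_min, "modE": lam_max / lam_min}
```

A design routine would then choose the sampling points that optimize the selected criterion, e.g. the set of substrate levels maximizing `criteria(F)["D"]`.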
Abstract:
Real-time systems demand guaranteed and predictable run-time behaviour in order to ensure that no task misses its deadline. Over the years we have witnessed an ever-increasing demand for functionality enhancements in embedded real-time systems. Along with the functionalities, the design itself grows more complex. Posed constraints, such as energy consumption, time, and space bounds, also require attention and proper handling. Additionally, efficient scheduling algorithms, as proven through analyses and simulations, often impose requirements that have a significant run-time cost, especially in the context of multi-core systems. In order to further investigate the behaviour of such systems and to quantify and compare the overheads involved, we have developed SPARTS, a simulator of a generic embedded real-time device. The tasks in the simulator are described by externally visible parameters (e.g. minimum inter-arrival time, sporadicity, WCET, BCET), rather than by the code of the tasks. While our current implementation is primarily focused on our immediate needs in the area of power-aware scheduling, it is designed to be extensible to accommodate different task properties, scheduling algorithms and/or hardware models for application in a wide variety of simulations. The source code of SPARTS is available for download at [1].
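The parameter-only task description can be pictured with a short sketch. This is a hypothetical analogue of the idea, not the SPARTS source, and the field names and distributions are assumptions:

```python
import random
from dataclasses import dataclass

@dataclass
class SporadicTask:
    """Sketch of a parameter-only task description, in the spirit of SPARTS:
    the simulator sees these externally visible parameters, never the
    task's code."""
    min_inter_arrival: float  # minimum separation between two releases
    sporadicity: float        # maximum extra delay beyond the minimum
    bcet: float               # best-case execution time
    wcet: float               # worst-case execution time

    def next_release(self, last_release, rng):
        """A sporadic release: at least min_inter_arrival after the last one."""
        return last_release + self.min_inter_arrival + rng.uniform(0.0, self.sporadicity)

    def execution_time(self, rng):
        """Actual demand drawn between the best and worst case."""
        return rng.uniform(self.bcet, self.wcet)
```

A scheduler model would then consume the generated release times and execution demands, so task properties and scheduling policy stay decoupled.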
Abstract:
Master's dissertation in Management and Business Control (Mestrado em Controlo de Gestão e dos Negócios).
Abstract:
Since the beginning of the use of immunohistochemistry in anatomic pathology, one of the goals has been to detect the smallest amounts of antigen, making it visible under the light microscope. Several amplification systems have been applied to achieve this goal, and a generic group of simple methods offering superior amplification has emerged: the so-called indirect polymer methods. Given the variety of methods available, the author sets out to compare the quality of four amplification systems that use the indirect polymer method with horseradish peroxidase (HRP). Slides of different formalin-fixed, paraffin-embedded tissues were used, in which 15 distinct antigens were identified. Amplification relied on four indirect polymer systems (Dako EnVision+ System – K4006; LabVision UltraVision LP Detection System – TL-004-HD; Leica NovoLink – RE7140-k; Vector ImmPRESS Reagent Kit – MP-7402). Microscopic observation and classification of the immunostaining obtained were based on an algorithm that combines intensity, specific staining, nonspecific staining, and contrast into a global score ranging from 0 to 25. Besides descriptive statistics, data were analysed with one-way ANOVA and Tukey's post-hoc test (alpha = 0.05). The best global-score result, in terms of mean/standard deviation, was obtained by NovoLink (22.4/2.37) and the worst by EnVision+ (17.43/3.86). Statistically significant differences were found between the results of the NovoLink system and the UltraVision (p = .004), ImmPRESS (p = .000), and EnVision+ (p = .000) systems. It was concluded that the system yielding the best results in this study was Leica NovoLink.
Abstract:
The evolution of the Lusitanian Basin, located on the western Iberian margin, is closely associated with the first opening phases of the North Atlantic. It persisted from the Late Triassic to the Early Cretaceous, more precisely until the end of the Early Aptian, and its evolution was conditioned by structures inherited from the Variscan basement. The part played by the faults that establish its boundaries, as regards the geometric and kinematic evolution and the organization of the sedimentary bodies, is discussed here, as is that of important faults transversal to the basin. A basin evolution model is proposed consisting of four rifting episodes which show: i) periods of symmetrical (horst and graben organization) and asymmetrical (half-graben organization) geometric evolution; ii) diachronous fracturing; iii) rotation of the main extensional direction; iv) rooting of the main basin faults in the Variscan basement (predominantly thick-skinned style). The analysis and regional comparison, particularly with the Algarve Basin, of the time intervals represented by important basin-scale hiatuses close to the renewal of the rifting episodes has led us to assume the occurrence of early tectonic inversions (Callovian–Oxfordian and Tithonian–Berriasian). The latter, however, had a subsequent evolution distinct from the first: there is no renewal of subsidence, which is discussed here, and it is related to a magmatic event. Although the Lusitanian Basin is located on a rift margin considered non-volcanic, the three magmatic cycles defined by many authors, particularly the second (approx. 130 to 110 My?), played a fundamental part in the mobilization of the Hettangian evaporites, resulting in the main diapiric events of the Lusitanian Basin. The manner and time in which the basin definitively ended its evolution (Early Aptian) is discussed here.
Comparisons are established with other west Iberian margin basins and with Newfoundland basins. A model of oceanization of this area of the North Atlantic is also presented, consisting of two events separated by approximately 10 My, and of distinct areas separated by the Nazaré fault. The elaboration of this synthesis was based on: information contained in previously published papers (1990–2000); field work carried out over recent years, the results of which have not yet been published; and information gathered from the reinterpretation of geological mapping and geophysical elements (seismic and well logs), and from generic literature concerning the Mesozoic of the west Iberian margin.
Abstract:
Applications with soft real-time requirements can benefit from code mobility mechanisms, as long as those mechanisms support the timing and Quality of Service requirements of applications. In this paper, a generic model for code mobility mechanisms is presented. The proposed model gives system designers the necessary tools to perform a statistical timing analysis on the execution of the mobility mechanisms that can be used to determine the impact of code mobility in distributed real-time applications.
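One simple form such a statistical timing analysis could take is an empirical quantile bound over measured mobility latencies. The sketch below is an assumption-laden illustration (the latency model and all names are invented), not the paper's model:

```python
import math
import random

def probabilistic_bound(samples, q=0.99):
    """Nearest-rank q-quantile of measured latencies: an empirical bound
    that is exceeded with probability of roughly (1 - q)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(q * len(ordered)))
    return ordered[rank - 1]

# hypothetical measurements of a code-migration latency (ms): a fixed
# transfer cost plus a random re-instantiation delay
rng = random.Random(7)
latencies = [5.0 + rng.expovariate(1.0) for _ in range(1000)]
```

A designer could then check whether, e.g., `probabilistic_bound(latencies, 0.99)` fits within the slack available to the soft real-time tasks that host the mobile code.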
Abstract:
We have developed SPARTS, a simulator of a generic embedded real-time device. It is designed to be extensible to accommodate different task properties, scheduling algorithms and/or hardware models for a wide variety of applications. SPARTS was developed to help the community investigate the behaviour of real-time embedded systems and to quantify the associated constraints/overheads.
Abstract:
Simulators are indispensable tools to support the development and testing of cooperating objects such as wireless sensor networks (WSN). However, it is often not possible to compare the results of different simulation tools. Thus, the goal of this paper is the specification of a generic simulation platform for cooperating objects. We propose a platform that consists of a set of simulators that together fulfil the desired simulator properties. We show that, to achieve comparable results, the use of a common specification language for the software-under-test is not feasible. Instead, we argue that using common input formats for the simulated environment and common output formats for the results is useful. This again motivates a simulation tool consisting of a set of existing simulators that are able to use a common scenario input and produce common output, which will bring us a step closer to the vision of achieving comparable simulation results.
Abstract:
In this thesis we implement estimation procedures in order to estimate threshold parameters for continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time-series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund to be considered is the Converging Europe Bond fund from the Schroder company, and the data are daily prices from the year 2005.
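The conditional least squares idea for the Brownian-motion-with-drift threshold model can be sketched as follows. This is a toy illustration with assumed parameters, not the thesis implementation: simulate the two-regime process, then grid-search the threshold that minimizes the residual sum of squares of the regime-wise demeaned increments.

```python
import random

def simulate_threshold_bm(n, dt, r, mu_low, mu_high, sigma, rng):
    """Euler path of a threshold Brownian motion with drift: the drift is
    mu_low below the threshold r and mu_high above it."""
    x = [0.0]
    for _ in range(n):
        mu = mu_low if x[-1] < r else mu_high
        x.append(x[-1] + mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0))
    return x

def cls_threshold(x, candidates):
    """Conditional least squares: for each candidate threshold, estimate the
    regime drifts as mean increments and keep the candidate that minimizes
    the residual sum of squares."""
    increments = [(x[i], x[i + 1] - x[i]) for i in range(len(x) - 1)]
    best_sse, best_r = None, None
    for r in candidates:
        low = [d for s, d in increments if s < r]
        high = [d for s, d in increments if s >= r]
        if not low or not high:
            continue  # candidate leaves one regime unobserved
        m_low, m_high = sum(low) / len(low), sum(high) / len(high)
        sse = (sum((d - m_low) ** 2 for d in low)
               + sum((d - m_high) ** 2 for d in high))
        if best_sse is None or sse < best_sse:
            best_sse, best_r = sse, r
    return best_r

# toy run: mean-reverting regimes around a true threshold of 0.0
rng = random.Random(1)
path = simulate_threshold_bm(20000, 0.01, 0.0, 1.0, -1.0, 1.0, rng)
r_hat = cls_threshold(path, [i / 10 for i in range(-10, 11)])
```

The same grid-search skeleton applies to the geometric Brownian motion or Ornstein-Uhlenbeck variants, with the regime-wise drift fit replaced accordingly.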
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. This has motivated many initiatives to develop scientific workflow tools. However, the existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the workflow task specification, decentralizing the control of workflow activities, and allowing tasks to run autonomously on distributed infrastructures, for instance on Clouds. Furthermore, many workflow tools only support the execution of Directed Acyclic Graphs (DAGs) without the concept of iterations, whereas activities may execute for millions of iterations over long periods of time and may require dynamic workflow reconfiguration after a certain iteration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on the Process Networks model, where the workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures, e.g. on Clouds. Each AWA executes a Task developed as a Java class that implements a generic interface, allowing end-users to code their applications without concern for low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also supports dynamic workflow reconfiguration and monitoring of workflow execution. We describe how AWARD supports dynamic reconfiguration and discuss typical workflow reconfiguration scenarios. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to a small dedicated cluster and to the Amazon Elastic Compute Cloud (EC2).
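The tuple-space coordination and the generic task interface can be illustrated with a minimal sketch. AWARD tasks are Java classes; the Python fragment below is only a schematic analogue, and all names in it are assumptions:

```python
class TupleSpace:
    """Minimal stand-in for the shared tuple space that coordinates
    activities: they communicate only by writing and taking tuples,
    never by direct messaging."""
    def __init__(self):
        self._tuples = []

    def put(self, tup):
        self._tuples.append(tup)

    def take(self, key):
        for t in self._tuples:
            if t[0] == key:
                self._tuples.remove(t)
                return t
        return None

class Task:
    """Generic task interface: end-user code only implements execute(),
    with no knowledge of the coordination layer."""
    def execute(self, value):
        raise NotImplementedError

class Double(Task):  # a toy user task
    def execute(self, value):
        return 2 * value

def run_activity(space, task, in_key, out_key):
    """One autonomic workflow activity: take an input tuple, run the user
    task, publish the result for downstream activities."""
    tup = space.take(in_key)
    if tup is not None:
        space.put((out_key, task.execute(tup[1])))

space = TupleSpace()
space.put(("stage0", 3))
run_activity(space, Double(), "stage0", "stage1")
run_activity(space, Double(), "stage1", "stage2")
```

Because activities only agree on tuple keys, swapping the `Task` bound to a key models the kind of dynamic reconfiguration the paper describes.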
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.
Abstract:
OBJECTIVE: To analyze the incremental cost-utility ratio of the surgical treatment of hip fracture in older patients. METHODS: This was a retrospective cohort study of a systematic sample of patients who underwent surgery for hip fracture at a central hospital of a macro-region in the state of Minas Gerais, Southeastern Brazil, between January 1, 2009 and December 31, 2011. A decision tree was built and analyzed considering the direct medical costs. The study followed the healthcare provider's perspective and had a one-year time horizon. Effectiveness was measured by the time elapsed between trauma and surgery after dividing the patients into early and late surgery groups. Utility was obtained in a cross-sectional and indirect manner using the EuroQol 5 Dimensions generic questionnaire, transformed into cardinal numbers using the national regulations established by the Center for the Development and Regional Planning of the State of Minas Gerais. The sample included 110 patients, 27 of whom were allocated to the early surgery group and 83 to the late surgery group. The groups were stratified by age, gender, type of fracture, type of surgery, and anesthetic risk. RESULTS: The direct medical cost presented a statistically significant increase among patients in the late surgery group (p < 0.005), mainly because of ward costs (p < 0.001). In-hospital mortality was higher in the late surgery group (7.4% versus 16.9%). The decision tree demonstrated the dominance of the early surgery strategy over the late surgery strategy: R$9,854.34 (USD 4,387.17) versus R$26,754.56 (USD 11,911.03) per quality-adjusted life year. A sensitivity test with extreme values confirmed the robustness of the results. CONCLUSIONS: After controlling for confounding variables, the strategy of early surgery for hip fracture in older adults was shown to be dominant, as it presented a lower cost and better results than late surgery.
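The dominance logic behind the decision-tree result can be sketched as follows; the QALY values in the example are purely illustrative, not the study's patient-level data:

```python
def compare_strategies(cost_a, qaly_a, cost_b, qaly_b):
    """Classify strategy A against comparator B: 'dominant' when A is both
    cheaper and more effective, 'dominated' in the opposite case; otherwise
    return the incremental cost-utility ratio (extra cost per QALY gained)."""
    d_cost = cost_a - cost_b
    d_qaly = qaly_a - qaly_b
    if d_cost <= 0 and d_qaly >= 0 and (d_cost < 0 or d_qaly > 0):
        return "dominant", None
    if d_cost >= 0 and d_qaly <= 0 and (d_cost > 0 or d_qaly < 0):
        return "dominated", None
    return "icer", d_cost / d_qaly

# the study's per-strategy cost figures paired with illustrative QALYs:
status, _ = compare_strategies(9854.34, 1.0, 26754.56, 0.9)
```

A dominant strategy needs no ICER at all: reporting a ratio would be misleading when one option is cheaper and more effective.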
Abstract:
Dissertation presented to obtain the degree of Doctor in Biochemistry, speciality of Physical Biochemistry, from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.