39 results for Corporate regulation framework
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
This investigation sought to describe, exhaustively, the process of planning, negotiation, implementation and evaluation of the Execution Contract (Contrato de Execução) signed between the Câmara Municipal de Sintra and the Ministério da Educação in 2009. This contract is an instrument provided for in the regulation of the framework for transferring education-related competences to municipalities, under the regime established by Decreto-Lei n.º 144/2008, of 28 July. Once the research problem and objectives were defined, the investigation centred on a case study describing and interpreting the process and the actions taken by the stakeholders between 2008 and 2011. Data obtained from documentary sources were confronted with interviews conducted with those responsible for the municipal education portfolio (Pelouro da Educação) and the directors of the school clusters (Agrupamentos de Escolas), in the light of the literature review and of the contributions of other researchers in this field. The investigation concluded that the contracting process was rather complex given the reality of this municipality, and that the legislation has several gaps regarding the contracting of this transfer of competences, notably because it attempts to generalise something that is not at all generalisable: the field of education, given the complexity of the educational territories concerned and of the actors involved in them.
Abstract:
This paper studies the evolution of the default risk premia for European firms during the years surrounding the recent credit crisis. We employ the information embedded in Credit Default Swaps (CDS) and Moody’s KMV EDF default probabilities to analyze the common factors driving these risk premia. The risk premium is characterized in several directions. Firstly, we perform a panel data analysis to capture the relationship between CDS spreads and actual default probabilities. Secondly, we employ the intensity framework of Jarrow et al. (2005) to measure the theoretical effect of the risk premium on expected bond returns. Thirdly, we estimate a dynamic panel data model to identify the macroeconomic sources of the risk premium. Finally, a vector autoregressive model analyzes which proportion of the co-movement is attributable to financial or macro variables. Our estimations report coefficients for the risk premium substantially higher than those previously reported for US firms, and time-varying behavior. A dominant factor explains around 60% of the common movements in risk premia. Additionally, empirical evidence suggests a public-to-private risk transfer between sovereign CDS spreads and corporate risk premia.
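The first step described above, relating CDS spreads to actual default probabilities in a panel while controlling for firm heterogeneity, can be sketched with a within-transformation (fixed effects) estimator. This is a minimal illustration in plain Python; the single-regressor setup, the function name and the variable names are assumptions, not the paper's actual specification.

```python
from collections import defaultdict

def within_ols_slope(panel):
    """Fixed-effects (within) OLS slope of spread on EDF: demean both
    variables by firm, then run OLS on the demeaned data.

    panel: list of (firm_id, edf, cds_spread) observations.
    """
    # First pass: per-firm sums to compute firm means.
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for firm, x, y in panel:
        s = sums[firm]
        s[0] += x
        s[1] += y
        s[2] += 1
    # Second pass: accumulate the OLS slope on demeaned data.
    num = den = 0.0
    for firm, x, y in panel:
        sx, sy, n = sums[firm]
        dx, dy = x - sx / n, y - sy / n
        num += dx * dy
        den += dx * dx
    return num / den
```

With a panel built so that each firm's spread is a firm-level constant plus three times its EDF, the estimator recovers a slope of 3 regardless of the firm constants, which is exactly what the within transformation is for.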
Losses under the participation exemption regime: the relationship between permanent establishment and subsidiary
Abstract:
Master's degree in Taxation
Abstract:
In the early 1990s, companies began to feel the need for better access to information about their activities to support decision-making. In the software world, this gave rise to the Business Intelligence (BI) sector, initially composed of data warehousing and reporting tools. Over the years, the concept of BI evolved with business needs, turning the analysis of organisations' activities and performance into critical aspects of their management. The BI area covers several segments, with reporting and data analysis being those that best meet the requirements for controlling access to business information and its processes. Today, time and information are competitive advantages, and for that very reason companies are increasingly concerned that the growing volume of information is becoming unsustainable, as the time needed to process it keeps increasing. For this reason, many software companies, such as Microsoft, IBM and Oracle, are fighting for a place in this expanding BI market. For companies to be competitive, the key requirement is the ability to anticipate and respond to market needs in real time, rather than merely reacting to a need once it is already too late. BI products have a reputation for working only with stored historical data, which means companies cannot rely on those solutions when some businesses require near real-time information. The latency introduced by a data warehouse is too high for acceptable performance. Hence the emergence of Business Activity Monitoring (BAM) technology, which provides near-real-time data analysis and alerts about business processes, using data sources such as Web Services, message queues, etc.
The concept of BAM was introduced in July 2001 by Gartner as an event-driven extension of the BI area. BAM is defined as real-time access to business performance indicators with the aim of increasing the speed and effectiveness of business processes. BAM solutions are becoming increasingly common and sophisticated.
Abstract:
With this article we intend to contribute to the understanding of what can make Online Collaborative Teams (OCT) effective. This is done by identifying what can be considered best practices for individual team members, for leaders of OCT, and for the organizations that the teams are a part of. Best practices in these categories were identified from the existing literature on online teams and collaborative work.
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is through motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique, where a low quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low quality Intra coded blocks, for blocks where MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
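The motion compensated interpolation idea behind the MCI mode can be illustrated with a toy full-search block matcher. This is a hedged sketch, not the paper's algorithm: frames are plain 2D lists of pixel intensities, matching uses a simple sum-of-absolute-differences (SAD) criterion, and the SI block is formed by averaging the past block with its best match in the future reference; all names and parameters are illustrative assumptions.

```python
def block_sad(f0, f1, y0, x0, y1, x1, bs):
    """SAD between a bs x bs block at (y0, x0) in f0 and one at (y1, x1) in f1."""
    return sum(abs(f0[y0 + i][x0 + j] - f1[y1 + i][x1 + j])
               for i in range(bs) for j in range(bs))

def motion_vector(past, future, y, x, bs, search):
    """Full search: the displacement (dy, dx) within +/- search that
    minimises the SAD between the past block at (y, x) and the
    displaced block in the future frame."""
    h, w = len(past), len(past[0])
    best = (float("inf"), (0, 0))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= h - bs and 0 <= xx <= w - bs:
                cost = block_sad(past, future, y, x, yy, xx, bs)
                best = min(best, (cost, (dy, dx)))
    return best[1]

def interpolate_block(past, future, y, x, bs, search):
    """Toy SI block: average the past block with the motion-compensated
    future block (assumes linear motion between the references)."""
    dy, dx = motion_vector(past, future, y, x, bs, search)
    return [[(past[y + i][x + j] + future[y + dy + i][x + dx + j]) / 2
             for j in range(bs)] for i in range(bs)]
```

For a bright patch that moves by one pixel diagonally between the two reference frames, the matcher recovers the (1, 1) displacement and the interpolated block reproduces the patch.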
Abstract:
Currently, there are no open source Business Intelligence (BI) tools to support financial management and analysis in companies in accordance with the Portuguese accounting standards system (Sistema de Normalização Contabilística, SNC). The distinct characteristics of each business, together with the requirements imposed by the SNC, make it complex to create a generic financial framework that efficiently satisfies the financial analyses companies need for their management. The goal of this project is to propose an OLAP-based framework capable of supporting accounting management and financial analysis, implemented exclusively with open source software, specifically the Pentaho platform. All accounting information, obtained from general accounting, cost accounting, budget management and financial analysis, is stored in a data mart. This data mart supports all financial analysis, including the analysis of budget variances and capital flows, giving companies an SNC-compatible BI tool that helps them in decision-making.
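The kind of budget variance analysis such a data mart supports reduces, at its core, to a rollup over a fact table. The sketch below is an illustrative assumption rather than the project's actual schema: each fact row carries an account code, an actual amount and a budgeted amount, and the variance is aggregated per account.

```python
def budget_variance(facts):
    """Per-account budget variance (actual minus budget) from a list of
    fact rows (account_code, actual_amount, budget_amount) -- a minimal
    stand-in for a data-mart rollup."""
    totals = {}
    for account, actual, budget in facts:
        a, b = totals.get(account, (0.0, 0.0))
        totals[account] = (a + actual, b + budget)
    return {acc: a - b for acc, (a, b) in totals.items()}
```

In a real OLAP deployment this aggregation would be expressed as a cube query over the fact table rather than a Python loop, but the grouping logic is the same.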
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform domain turbo coding based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
Abstract:
Longevity risk is one of the major risks that an insurance company or a pension fund has to deal with, and its importance is expected to grow in the near future. In line with these considerations, in the Solvency II regulation the Standard Formula provided for calculating the Solvency Capital Requirement (SCR) explicitly considers this kind of risk. In accordance with the new European rules, in this paper we suggest a multiperiod approach to evaluate the SCR for longevity risk. We propose a backtesting framework for measuring the consistency of SCR calculations for life insurance policies.
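A one-year SCR computed as an empirical 99.5% loss quantile, together with a simple breach-count backtest, can be sketched as follows. This illustrates only the general Solvency II value-at-risk logic, not the paper's multiperiod approach; the function names and the crude empirical quantile rule are assumptions.

```python
def scr_longevity(simulated_losses, level=0.995):
    """SCR as the empirical quantile (at the given level) of the
    simulated one-year loss distribution -- a crude empirical VaR
    with no interpolation between order statistics."""
    ordered = sorted(simulated_losses)
    k = min(len(ordered) - 1, int(level * len(ordered)))
    return ordered[k]

def backtest_breaches(scr, realised_losses):
    """Breach count: periods in which the realised loss exceeded the
    SCR held at the start of the period. A consistent calibration
    should breach in roughly 0.5% of periods."""
    return sum(1 for loss in realised_losses if loss > scr)
```

In practice the simulated losses would come from a stochastic mortality model projected over the policy horizon; here any list of numbers serves to demonstrate the mechanics.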
Abstract:
The main purpose of this study is to analyse the changes caused by the global financial crisis in the influence of board characteristics on corporate results, in terms of corporate performance, corporate risk-taking, and earnings management. The sample comprises S&P 500 listed firms during 2002-2008. This study reveals that environmental conditions call for different behaviour from directors to fulfil their responsibilities, and it suggests changes in normative and voluntary guidelines for improving good practices in the boardroom.
Abstract:
The importance of Social Responsibility (SR) is greater when this business variable is related to other strategic variables of business activity: the competitive success the company achieves, the performance the firm develops, and the innovations it carries out. The hypothesis is that organizations that focus on SR are those that obtain higher outputs and innovate more, achieving greater competitive success. A scale for measuring orientation towards SR was defined in order to determine the degree of relationship between the above elements. This instrument is original because no previous scales exist in the literature that measure, on the one hand, the three theoretically accepted classic sub-constructs that make up SR and, on the other hand, the relationship between SR and the other variables. As a result of the causal relationship analysis, we arrive at a scale of 21 indicators, validated with a sample of firms from the Autonomous Community of Extremadura; to our knowledge, this is the first empirical validation of these dimensions in this context.
Abstract:
Acetylcholine (ACh) has been shown to exert an anti-inflammatory function by down-modulating the expression of pro-inflammatory cytokines. Its availability can be regulated at different levels, namely at its synthesis and degradation steps. Accordingly, the expression of acetylcholinesterase (AChE), the enzyme responsible for ACh hydrolysis, has been observed to be modulated in inflammation. To further address the mechanisms underlying this effect, we aimed here at characterizing AChE expression in distinct cellular types pivotal to the inflammatory response. This study was performed in the human acute monocytic leukaemia cell line THP-1, in human monocyte-derived primary macrophages, and in human umbilical cord vein endothelial cells (HUVEC). To subject these cells to inflammatory conditions, THP-1 cells and macrophages were treated with lipopolysaccharide (LPS) from E. coli, and HUVEC were stimulated with tumour necrosis factor α (TNF-α). Our results showed that although AChE expression was generally up-regulated at the mRNA level under inflammatory conditions, distinct AChE protein expression profiles were surprisingly observed among the cellular types studied. Altogether, these results argue for the existence of cell-specific mechanisms that regulate the expression of acetylcholinesterase in inflammation.
Abstract:
The purpose of this paper is to analyse how educational policies on school violence are reinterpreted and implemented at the school level, and whether this process contributes to a more pluralistic and democratic school. Research carried out in 3 clusters of schools showed that the diversity of understandings and strategies to face school violence, higher within the territories than between them, was associated with the school boards' agendas and with the legitimacy of the different actors to interpret and act within the national policy framework. There was high consistency between violence management strategies and the ways schools faced social and cultural diversity. Schools that favour more inclusive strategies to deal with violence tend to provide greater educational opportunities; conversely, those that favour repressive strategies are more likely to adopt selective educational and social strategies, with a narrower educational offer and less participation of teachers, students and parents in violence regulation.
Expert opinion on best practice guidelines and competency framework for visual screening in children
Abstract:
PURPOSE: Screening programs to detect visual abnormalities in children vary among countries. The aim of this study is to describe experts' perception of best practice guidelines and a competency framework for visual screening in children. METHODS: A qualitative focus group technique was applied during the Portuguese national orthoptic congress to obtain the perception of an expert panel of 5 orthoptists and 2 ophthalmologists with experience in visual screening for children (mean age 53.43 years, SD ± 9.40). The panel received in advance a script with the description of three tuning competency dimensions (instrumental, systemic, and interpersonal) for visual screening. The session was recorded in video and audio. Qualitative data were analyzed using a categorical technique. RESULTS: According to the experts' views, six tests (35.29%) have to be included in a visual screening: distance visual acuity test, cover test, bi-prism or 4/6(Δ) prism, fusion, ocular movements, and refraction. Screening should be performed according to the child's age, before and after 3 years of age (17.65%). The expert panel highlighted the influence of professional experience on the application of a screening protocol (23.53%). They also showed concern about the control of false negatives (23.53%). Instrumental competencies were the most cited (54.09%), followed by interpersonal (29.51%) and systemic (16.4%). CONCLUSIONS: Orthoptists should have professional experience before starting to apply a screening protocol. False negative results are a concern that has to be more thoroughly investigated. The proposed framework focuses on the core competencies highlighted by the expert panel. Competency programs could be important to develop better screening programs.
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Since expert sonographers are required to accurately interpret liver ultrasound images, a lack of such expertise results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors achieved a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, added to the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
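The general pattern of such a CAD pipeline, extracting a scalar feature from each image and training a supervised classifier on labelled examples, can be illustrated with a one-feature decision stump. This is a deliberately simplified stand-in for the paper's texture, wavelet and higher order spectra features and its decision tree; every name and the single-feature setup below are hypothetical illustrations.

```python
def mean_intensity(image):
    """Trivial image feature: mean pixel intensity of a 2D list."""
    vals = [v for row in image for v in row]
    return sum(vals) / len(vals)

def fit_stump(features, labels):
    """Train a decision stump: exhaustively pick the threshold (among
    midpoints of sorted feature values) that maximises accuracy, where
    label True means 'abnormal' and the rule is feature > threshold."""
    pairs = sorted(zip(features, labels))
    candidates = [pairs[0][0] - 1] + [
        (a[0] + b[0]) / 2 for a, b in zip(pairs, pairs[1:])]
    best = (-1.0, candidates[0])  # (accuracy, threshold)
    for t in candidates:
        acc = sum(1 for f, y in pairs if (f > t) == y) / len(pairs)
        if acc > best[0]:
            best = (acc, t)
    return best[1]

def predict(threshold, feature):
    """Classify a new image feature: True ('abnormal') above threshold."""
    return feature > threshold
```

A real decision tree simply stacks many such threshold tests over many features; the stump captures the training logic of a single node.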