17 results for Project Read and Write


Relevance:

100.00%

Publisher:

Abstract:

In a broad sense, Project Financing, as a means of financing large-scale infrastructure projects worldwide, has seen a steady growth in popularity over the last 20 years, relatively unscathed by most economic cycles. In the wake of the systemic Financial Crisis of 2007, however, Project Financing also ran into trouble. The liquidity freeze and credit crunch that ensued affected all parties involved. Traditional lenders of this type of financial instrument, locked into long-term contractual obligations, were severely hit by the scarcity of funding, compounded by a rapidly increasing cost of funding. Banks, meanwhile, were “rescued” by the concerted actions of Central Banks and other Multilateral Agencies around the world, yet at the same time “stressed” by the upcoming regulatory effort of the Basel Committee. This impact resulted in specific changes to this type of long-term financing: increased risk aversion among Commercial Banks; higher pricing and shorter maturities of credit facilities; enforcement of Market Disruption Event clauses; Multilateral Agencies taking partial responsibility for project risk; and the adoption of utility-like availability payments in other industrial sectors, such as transportation and even social infrastructure. This report is divided into three parts. First, an instructional part touches on the academic literature (theory) and gives the banks’ perspective (practice), mostly as an overview of Project Finance for awareness’ sake. The renowned Harvard Business School professor Benjamin Esty states that Project Finance is a “relatively unexplored territory for both empirical and theoretical research”, meaning that academic research lags the practice of Project Finance. Second, the report presents a practical case on the first road concession in Portugal, in 1998, ending with the lessons learned 10 years after Financial Close. Lastly, the report concludes with an analysis of the current trends and changes to the industry after the Financial Crisis of the late 2000s. To this end, I reference relevant papers, books on the subject, online articles and my own experience in the Project Finance Department of a major Portuguese Investment Bank. Regarding the latter, having signed a confidentiality agreement, I duly omit sensitive and proprietary bank information.

Relevance:

100.00%

Publisher:

Abstract:

Current computer systems have evolved from featuring a single processing unit and limited RAM, in the order of kilobytes or a few megabytes, to featuring several multicore processors, offering tens of concurrent execution contexts, and main memory in the order of tens to hundreds of gigabytes. This allows all the data of many applications to be kept in main memory, leading to the development of in-memory databases. Compared to disk-backed databases, in-memory databases (IMDBs) are expected to provide better performance by incurring less I/O overhead. In this dissertation, we present a scalability study of two general-purpose IMDBs on multicore systems. The results show that current general-purpose IMDBs do not scale on multicores, due to contention among threads running concurrent transactions. In this work, we explore different directions for overcoming the scalability issues of IMDBs on multicores while enforcing strong isolation semantics. First, we present MacroDB, a solution that requires no modification to either the database system or the applications. MacroDB replicates the database among several engines, using a master-slave replication scheme in which update transactions execute on the master while read-only transactions execute on the slaves. This reduces contention, allowing MacroDB to offer scalable performance under read-only workloads, although update-intensive workloads suffer a performance loss compared to the standalone engine. Second, we delve into the database engine and identify the concurrency control mechanism used by the storage sub-component as a scalability bottleneck. We then propose a new locking scheme that allows such mechanisms to be removed from the storage sub-component. This modification improves performance under all workloads compared to the standalone engine, although scalability remains limited to read-only workloads. Next, we address the scalability limitations under update-intensive workloads and propose reducing the locking granularity from the table level to the attribute level. This further improves performance for intensive and moderate update workloads, at a slight cost for read-only workloads, with scalability limited to read-intensive and read-only workloads. Finally, we investigate the impact applications have on the performance of database systems by studying how the order of operations inside transactions influences database performance. We then propose a Read-before-Write (RbW) interaction pattern, under which transactions perform all read operations before executing write operations. The RbW pattern allowed TPC-C to achieve scalable performance on our modified engine for all workloads. Additionally, it allowed our modified engine to scale on multicores almost up to the total number of cores, while enforcing strong isolation.
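To make the RbW idea concrete, here is a minimal Python sketch of a transaction that performs all of its reads before any of its writes. SQLite, the stock schema and the fill_order helper are illustrative assumptions, not the dissertation's modified engine:

```python
import sqlite3

# Autocommit mode, so BEGIN/COMMIT below control the transaction explicitly.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE stock (item_id INTEGER PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO stock VALUES (?, ?)", [(1, 10), (2, 5)])

def fill_order(conn, items):
    """Decrement stock for an order, reading everything before writing (RbW)."""
    cur = conn.cursor()
    cur.execute("BEGIN")
    # Phase 1: all reads up front.
    qty = {item: cur.execute("SELECT qty FROM stock WHERE item_id = ?",
                             (item,)).fetchone()[0]
           for item, _ in items}
    # Phase 2: all writes, with no reads interleaved.
    for item, amount in items:
        cur.execute("UPDATE stock SET qty = ? WHERE item_id = ?",
                    (qty[item] - amount, item))
    cur.execute("COMMIT")

fill_order(conn, [(1, 3), (2, 2)])
print(conn.execute("SELECT * FROM stock").fetchall())  # [(1, 7), (2, 3)]
```

Grouping the reads ahead of the writes is what, per the abstract, lets the engine avoid read/write interleaving inside a transaction and scale under strong isolation.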

Relevance:

100.00%

Publisher:

Abstract:

Flexible forms of work, such as project work, are gaining importance in industry and services. Looking at the research on project work, the vast majority of the existing literature is on project management, but problems concerning the quality of work and the efficiency of project teams are becoming increasingly visible. The question is how project work can be structured so as to simultaneously provide efficient, flexible work and healthy working conditions that ensure the development of human resources over the long term. Selected results of publicly funded research into project work are presented, based on case studies of 7 software development / IT consulting project teams (N=34). A set of different methods was applied: interviews with management and project managers, group interviews on work constraints, a monthly diary on well-being and critical incidents in the course of the project, and a final evaluation questionnaire on project outcomes focusing on economic and health aspects. Findings reveal that different types of projects exist, with varying degrees of team members’ autonomy and influence on work structuring. An effect of self-regulation on mental strain could not be found. The results emphasize that contradictory requirements, and organizational resources insufficient for the work requirements, lead to increased work intensity or work obstruction. These contradictory requirements are identified as the main drivers of stress. Finally, employees reporting high stress levels for more than 2 months have significantly higher exhaustion rates than those with only one-month peaks. To structure project work while taking its dynamics into account, the project team needs an active role in contract negotiation and in the detailed definition of the work – this is not only a question of individual autonomy but of negotiating the range of options for work structuring. Therefore, along with the sequential definition of the (software) product, the working conditions need to be re-defined.

Relevance:

100.00%

Publisher:

Abstract:

The extraction of relevant terms from texts is an extensively researched task in Text Mining. Relevant terms have been applied in areas such as Information Retrieval or document clustering and classification. However, relevance has a rather fuzzy nature, since the classification of some terms as relevant or not is not consensual. For instance, while words such as "president" and "republic" are generally considered relevant by human evaluators, and words like "the" and "or" are not, terms such as "read" and "finish" gather no consensus about their semantics and informativeness. Concepts, on the other hand, have a less fuzzy nature. Therefore, instead of deciding on the relevance of a term during the extraction phase, as most extractors do, I propose to first extract from texts what I have called generic concepts (all concepts) and to postpone the decision about relevance to downstream applications, according to their needs. For instance, a keyword extractor may assume that the most relevant keywords are the most frequent concepts in the documents. Moreover, most statistical extractors are incapable of extracting single-word and multi-word expressions with the same methodology. These factors led to the development of the ConceptExtractor, a statistical and language-independent methodology explained in Part I of this thesis. In Part II, I show that the automatic extraction of concepts has great applicability. For instance, for the extraction of keywords from documents, using the Tf-Idf metric only on concepts yields better results than using Tf-Idf without concepts, especially for multi-word expressions. In addition, since concepts can be semantically related to other concepts, they allow us to build implicit document descriptors. These applications led to published work. Finally, I present some work that, although not yet published, is briefly discussed in this document.
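As a rough illustration of the keyword-extraction idea above, here is a minimal Python sketch that scores previously extracted concepts (including multi-word ones) with Tf-Idf. The extract_concepts stub is hypothetical; it merely stands in for the ConceptExtractor, whose actual statistical method is described in the thesis:

```python
import math
from collections import Counter

def extract_concepts(text):
    # Stand-in for the ConceptExtractor: pretend these single- and
    # multi-word concepts were statistically extracted from the text.
    known = ["president", "republic", "prime minister", "election"]
    return [c for c in known if c in text.lower()]

def keywords(docs, top_k=2):
    """Rank each document's extracted concepts by Tf-Idf."""
    per_doc = [extract_concepts(d) for d in docs]
    df = Counter(c for concepts in per_doc for c in set(concepts))
    n = len(docs)
    ranked = []
    for concepts in per_doc:
        tf = Counter(concepts)
        scores = {c: tf[c] * math.log(n / df[c]) for c in tf}
        ranked.append(sorted(scores, key=scores.get, reverse=True)[:top_k])
    return ranked

docs = ["The president of the republic met the prime minister.",
        "A new election for president was announced."]
print(keywords(docs))  # multi-word concepts are scored like any other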

Relevance:

100.00%

Publisher:

Abstract:

The momentum and carry anomalies have been extensively documented in the literature. However, many issues relating to the risks associated with them remain unexplained. One is the fact that, under both momentum and carry strategies, an investor holds the most volatile assets for too long; as a result, the portfolios present a level of risk, and a probability of extreme events, that is inconsistent with their reward. This work project hypothesizes, and shows, that introducing risk parity rules in the portfolio weights does improve the risk-reward profile of carry strategies; however, it fails to do so under momentum strategies.
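For illustration, here is a minimal Python sketch of one common risk parity rule, inverse-volatility weighting, applied to made-up return series. The exact weighting rule used in the work project is not specified in this abstract, so this is only a representative instance:

```python
import statistics

def inverse_vol_weights(return_series):
    """Weight each asset proportionally to the inverse of its volatility."""
    inv_vol = [1.0 / statistics.stdev(r) for r in return_series]
    total = sum(inv_vol)
    return [w / total for w in inv_vol]

returns = [
    [0.02, -0.01, 0.03, 0.00],   # low-volatility asset
    [0.10, -0.08, 0.12, -0.05],  # high-volatility asset
]
print(inverse_vol_weights(returns))  # the volatile asset gets the smaller weight
```

Down-weighting the most volatile assets is precisely what counteracts the problem the abstract identifies: holding volatile positions at too large a size for too long.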

Relevance:

100.00%

Publisher:

Abstract:

The main goals of this dissertation are to research how practices and concepts from Agile Project Management can be applied in a non-IT context and to determine which aspects should be considered when deciding whether an Agile approach should be implemented. Previous studies reflect on adoption in the identified context; however, the recognition of these practices and concepts by the Project Management field of study remains unresolved. The adoption of Agile Project Management emerged as a reaction against traditional approaches, mainly due to their inability to accommodate requirement changes. These practices and concepts can therefore be considered in order to reduce the risks arising from increased competition and innovation – which do not apply to the IT sector alone. The current study reviews the literature on Agile Project Management and its adoption across different sectors in order to assess which practices and concepts can be applied in a non-IT context. Nine different methods are reviewed, two of which show higher relevance: Scrum and Extreme Programming. The identified practices and concepts can be separated into four groups: cultural and organizational structures, process, practices, and artefacts. A framework based on the work of Boehm & Turner (2004) is developed to support the decision to adopt agile methods. A survey of project managers was carried out to assess the implementation of the identified practices and concepts and to evaluate which variables have the highest importance in the developed decision-support framework. It is concluded that New Product Development is the project type with the highest potential for an agile approach, and that the final product's innovativeness, its competitiveness, and the project members' experience and autonomy are the most important aspects when considering the implementation of an Agile approach.

Relevance:

100.00%

Publisher:

Abstract:

The city of Seixal, as a Healthy City, has as its mission the implementation of the principles and strategies of the WHO Healthy Cities Project. To this end, it develops programmes and actions jointly with intersectoral partners, with a view to improving the health and quality of life of the citizens residing in the city of Seixal, while simultaneously promoting community participation. Because selective waste separation depends on the participation of citizens, this investigation studies the factors favourable and unfavourable to citizens' adherence to the selective separation of urban solid waste in the city of Seixal as a Healthy City. The quantitative paradigm was chosen to guide the development of this study, through the survey method (a numerical description of a fraction of the population – the sample – obtained by administering questionnaires). The questionnaire developed for this investigation will be administered at the Fórum Municipal do Seixal to a sample of 250 citizens residing in the city of Seixal. To obtain the final results, a descriptive analysis of all variables will first be carried out, including measures of location and variability appropriate to each variable. In a second phase, an inferential analysis will be performed using non-parametric and parametric tests.
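As an illustration of the planned two-phase analysis, here is a minimal Python sketch on made-up questionnaire scores: descriptive measures of location and variability first, then one parametric and one non-parametric test. The groups and data are hypothetical, and scipy is assumed to be available:

```python
import statistics
from scipy import stats

# Hypothetical 1-5 agreement scores from two groups of respondents.
separators = [4, 5, 4, 3, 5, 4, 4]
non_separators = [2, 3, 2, 4, 3, 2, 3]

# Phase 1: descriptive analysis (location and variability per variable).
for name, sample in [("separators", separators),
                     ("non-separators", non_separators)]:
    print(name, "mean:", round(statistics.mean(sample), 2),
          "median:", statistics.median(sample),
          "stdev:", round(statistics.stdev(sample), 2))

# Phase 2: inferential analysis, parametric and non-parametric.
print(stats.ttest_ind(separators, non_separators))
print(stats.mannwhitneyu(separators, non_separators))
```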

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologias da Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering.

Relevance:

100.00%

Publisher:

Abstract:

Based on the report for Project III of the PhD programme on Technology Assessment, prepared for the Winter School that took place at Universidade Nova de Lisboa, Caparica Campus, on the 6th and 7th of December 2010.

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to obtain the Ph.D. degree in Biology.

Relevance:

100.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

100.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics.

Relevance:

100.00%

Publisher:

Abstract:

The summer school “Renewable Energy Systems: Role and Use of Parliamentary Technology Assessment” was the first European summer school with a pure focus on technology assessment. The aim of the three-day summer school of the European project Parliaments and Civil Society in Technology Assessment (PACITA) was to create awareness of the potential of technology assessment groups in Europe. To this end, the summer school combined keynotes, practical exercises, mutual reflection, cutting-edge training and networking to address the theme of renewable energy systems from the perspective of Technology Assessment (TA), to meet transition objectives, and to critically assess energy technologies.

Relevance:

100.00%

Publisher:

Abstract:

Clinical analyses are a precious element among diagnostic and therapeutic tests, as they provide an enormous range of information on the state of health of a given user. The aim of the laboratory is to supply analytical information on biological samples that is reliable, relevant and timely. Since health is at stake, and in light of the laboratory's purpose, its importance is evident, as is that of the factors involved in fulfilling it. A good laboratory cycle, comprising the pre-analytical, analytical and post-analytical phases, is crucial for the laboratory's objective to be met rigorously and quickly. The present work, "Error in the Pre-Analytical Phase: Non-Compliant Samples versus Procedures", carried out within the Master's in Quality and Organization in the Clinical Analysis Laboratory, aims to emphasize the importance of the pre-analytical phase, which is identified as the main source of errors that delay the release of results or prevent them from being as reliable as desired, potentially leading to false diagnoses and wrong clinical decisions. This phase, which starts with the medical request and ends with the arrival of the biological samples at the laboratory, involves a variety of procedures and, consequently, a large number of participants, in addition to many factors that influence the sample and its results. These factors, capable of somehow altering the "truth" of the analytical results, must be identified and taken into consideration so that we can be confident that the results support precise diagnoses and a correct evaluation of the user's condition. Collections that, for whatever reason, do not yield samples that fulfil the purpose of their collection, and are therefore not compliant with what is intended, constitute an important source of error in this pre-analytical phase. In this study, we consulted data on non-compliant blood and urine samples detected at the laboratory under study during the 1st quarter of 2012, to identify the types of faults that occur and their frequency. The clinical analysis technicians working at the laboratory were asked to complete a questionnaire about their daily procedures, thus forming the population for the second part of the project. Completed and returned anonymously, this questionnaire aimed to characterize collection procedures and, hypothetically, to confront them with the non-compliant samples observed. In the first semester of 2012, out of a total of 25319 users, 146 collections had to be repeated because they were found to be non-compliant. "Sample not collected" was the most frequent non-compliance (50%), versus "incorrect identification", which had only one occurrence. There were also non-compliances that went unrecorded, such as "inadequate preparation" and "inappropriately packaged sample". The technicians proved to be competent professionals, knowledgeable about the tasks they perform and committed to carrying them out with quality.
Eliminating error will certainly not be within our reach, but acknowledging its presence, detecting it and evaluating its frequency will help to reduce its occurrence and improve quality in the pre-analytical phase, giving this phase the relevance it holds within the laboratory process.

Relevance:

100.00%

Publisher:

Abstract:

The corporate world is becoming more and more competitive, leading organisations to adapt by adopting more efficient processes, which result in lower costs and higher product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimate for the project. This estimate is the main focus of this project. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis was carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where plenty of objective information is available to be used as input to an estimation model, model-based methods usually yield better results than expert judgement. More frequently, however, this volume and quality of information is not available, which hurts the performance of model-based methods and favours the use of expert judgement. In practice, most organisations use expert judgement, making themselves dependent on the expert. A common problem is that the performance of the expert's estimates depends on his or her previous experience with similar projects. This means that when new types of projects arrive, the estimates will have unpredictable accuracy. Moreover, different experts will make different estimates based on their individual experience; as a result, the company does not directly accumulate knowledge about how estimates should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database and the resources available. Altran currently does not store the input information from previous projects in a systematic way; it has a small project database and a team of experts. Our work targets companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied in the organisation's context. A gap analysis is used to understand what type of information the company would have to collect so that other approaches would become available. Based on our assessment, expert judgement is, in our opinion, the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This resulted in the identification of common estimation deviations, errors and patterns, which led to the proposal of metrics that help estimators produce estimates leveraging quantitative and qualitative information from past projects in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process, gathering as much relevant information as possible from each finished project.