Resumo:
The aim of this investigation is to examine how the problem of nihilism is conceived in the confrontation between Heidegger and Jünger. To that end, two texts are taken as the starting point: on the one hand, Jünger's essay Über die Linie, published in 1949 in the Festschrift dedicated to Heidegger on the occasion of his sixtieth birthday; on the other hand, the "reply" given by Heidegger in 1955 in his contribution to an analogous Festschrift dedicated to Jünger – a "reply" originally published under the title Über »die Linie« (and later republished under the title Zur Seinsfrage).
Resumo:
This paper studies the performance of two different Risk Parity strategies, one from Maillard (2008) and a "naïve" one already used by market practitioners, against traditional strategies. The tests compare different regions (US, UK, Germany and Japan) from 1991 to 2013 and use different measures of volatility. The main findings are that Risk Parity outperforms every traditional strategy, and that the "true" version (Maillard's) delivers considerably better results than the "naïve" one when historical volatility is used, while with EWMA volatility there are significant differences between the two.
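The two ingredients mentioned above can be sketched briefly. This is a minimal illustration under common assumptions (naïve risk parity as inverse-volatility weighting, a RiskMetrics-style EWMA with λ = 0.94), not the paper's actual implementation:

```python
import math

def ewma_volatility(returns, lam=0.94):
    """EWMA volatility estimate: exponentially weighted average of squared
    returns (RiskMetrics-style decay), seeded with the first observation."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)

def naive_risk_parity_weights(vols):
    """Naive risk parity: weight each asset proportionally to 1/sigma_i,
    normalized so the weights sum to one."""
    inv = [1.0 / v for v in vols]
    total = sum(inv)
    return [x / total for x in inv]

vols = [0.20, 0.10, 0.05]  # hypothetical asset volatilities
w = naive_risk_parity_weights(vols)
# lower-volatility assets receive larger weights
```

The "true" Maillard version instead solves for weights equalizing each asset's total risk contribution, which requires a numerical optimizer and is omitted here.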
Resumo:
This paper analyses the boundaries of simplified wind turbine models used to represent the behavior of wind turbines in power system stability studies. Based on experimental measurements, the response of the recent simplified (also known as generic) wind turbine models currently being developed under International Standard IEC 61400-27 is compared to that of the complex detailed models elaborated by wind turbine manufacturers. This International Standard, whose Technical Committee was convened in October 2009, is focused on defining generic simulation models for both wind turbines (Part 1) and wind farms (Part 2). The results of this work provide an improved understanding of the usability of generic models for conducting power system simulations.
Resumo:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulating liver functionality in vitro requires advanced culture strategies that mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized to improve hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultured with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stable drug metabolism enzymes and higher viability in co-culture. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models that are reproducible, scalable and compatible with screening platforms.
The phenotypic and functional characterization of the in vitro systems contributes to the state of the art of human hepatic cell models and can be applied to improve the efficiency of pre-clinical drug development, to model disease and, ultimately, to develop cell-based therapeutic strategies for liver failure.
Resumo:
INTRODUCTION: In Colombia, there are no published studies comparing artemisinin combination therapies for the treatment of uncomplicated Plasmodium falciparum malaria. Hence, this study intended to demonstrate the non-inferior efficacy/safety profile of artesunate + amodiaquine versus artemether-lumefantrine. METHODS: A randomized, controlled, open-label, noninferiority (Δ≤5%) clinical trial was performed in adults with uncomplicated P. falciparum malaria using the 28-day World Health Organization validated design and definitions. Patients were randomized 1:1 to either oral artesunate + amodiaquine or artemether-lumefantrine. The primary efficacy endpoint was adequate clinical and parasitological response; the secondary endpoints were treatment failures as defined by the World Health Organization. Safety was assessed through adverse events. RESULTS: A total of 105 patients were included in each group, with zero censored observations. Mean adequate clinical and parasitological response rates (95% confidence intervals) were 100% for artesunate + amodiaquine and 99% for artemether-lumefantrine; the noninferiority criterion was met (Δ=1.7%). There was one late parasitological therapeutic failure (1%; artemether-lumefantrine group), typed by polymerase chain reaction as the MAD20 MSP1 allele. The fever clearance time was significantly shorter in the artesunate + amodiaquine group (p=0.002). Abdominal pain rates for artesunate + amodiaquine and artemether-lumefantrine were, respectively, 1.9% and 3.8% at baseline (p=0.68) and 1% and 13.3% after treatment (p<0.001). CONCLUSIONS: Uncomplicated P. falciparum malaria treatment with artesunate + amodiaquine is noninferior to the standard artemether-lumefantrine treatment. The efficacy/safety profiles warrant further studies in this and similar populations.
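A noninferiority criterion of this kind is commonly checked by requiring the lower confidence bound of the risk difference to lie above -Δ. The sketch below is a minimal Wald-type illustration of that logic, not the trial's actual statistical method; the z value and margin are assumptions:

```python
import math

def noninferiority_met(p_test, p_ref, n_test, n_ref, delta=0.05, z=1.96):
    """Wald-type noninferiority check on a risk difference: the lower
    confidence bound of (p_test - p_ref) must exceed -delta."""
    diff = p_test - p_ref
    se = math.sqrt(p_test * (1 - p_test) / n_test
                   + p_ref * (1 - p_ref) / n_ref)
    lower = diff - z * se  # lower bound of the two-sided CI
    return lower > -delta

# Cure rates from the abstract: 100% vs 99%, 105 patients per arm.
met = noninferiority_met(1.00, 0.99, 105, 105)
```

With these inputs the lower bound is about -0.9%, comfortably above the -5% margin, consistent with the reported conclusion.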
Resumo:
Following orders, hierarchical obedience and military discipline are essential values for the survival of the armed forces. Without them, it is not possible to conceive of the armed forces as an essential pillar of a democratic state of law and a guarantor of national independence. Since issuing orders, as well as receiving and following them, is inextricably linked to military discipline, and since such injunctions entail the workings of a particular obedience regime within the specific kind of organized power framework that is the Armed Forces, only by analysing the importance of such orders within this microcosm – with its strict hierarchical structure – will it be possible to determine which criminal judicial qualification to ascribe to the individual at the rear by reference to the role of the front-line individual (i.e. the one who issues an order versus the one who executes it). That is, of course, when we are faced with the practice of unlawful acts, keeping in mind the organizational framework and its influence over the will of the executor. One thing we take as read: if the orders can be described as unlawful, the boundary line of the duty of obedience, which cannot be overstepped owing to both a legal and a constitutional imperative, will have been crossed. And the military have sworn an oath of obedience to the fundamental law. The topic of hierarchical obedience cannot be separated from an analysis of the current legislation pertaining to it within military institutions. With that in mind, it seemed relevant to address the major norms regulating the matter within the Portuguese military legal system and, whenever necessary and required by the reality under analysis, to relate them to civilian law or legal doctrine.
Resumo:
This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. It draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition techniques. By grouping returns into sequences, Hidden Markov Models can predict market direction the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
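The phoneme-recognition analogy can be illustrated with a toy discrete-output HMM: a sequence of discretized daily returns (0 = down day, 1 = up day) is scored under competing "up-regime" and "down-regime" models, and the higher-likelihood model gives the predicted direction, just as a speech recognizer picks the phoneme model that best explains an acoustic sequence. All parameters below are hypothetical, not the paper's estimates:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | model) for a discrete-output HMM.
    pi: initial state probabilities, A: transition matrix, B: emission matrix."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    c = sum(alpha)
    loglik = math.log(c)
    alpha = [a / c for a in alpha]          # normalize to avoid underflow
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
        c = sum(alpha)
        loglik += math.log(c)
        alpha = [a / c for a in alpha]
    return loglik

# Hypothetical two-state models over binary observations (0=down, 1=up).
up_model   = dict(pi=[0.5, 0.5], A=[[0.8, 0.2], [0.2, 0.8]],
                  B=[[0.3, 0.7], [0.4, 0.6]])   # both states favor up days
down_model = dict(pi=[0.5, 0.5], A=[[0.8, 0.2], [0.2, 0.8]],
                  B=[[0.7, 0.3], [0.6, 0.4]])   # both states favor down days

seq = [1, 1, 0, 1, 1]  # a mostly-up sequence of discretized returns
prediction = ("up" if forward_loglik(seq, **up_model)
              > forward_loglik(seq, **down_model) else "down")
```

In practice the model parameters would be fitted with Baum-Welch on historical sequences rather than set by hand.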
Resumo:
The life of humans and of most living beings depends on sensation and perception for the best assessment of the surrounding world. Sensorial organs acquire a variety of stimuli that are interpreted and integrated in our brain for immediate use or stored in memory for later recall. As part of reasoning, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and forming part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But prior to assessing data for its usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. The limitations of systems in handling sensorial data, compared with our own sensorial capabilities, constitute one identified problem. Another is the lack of interoperability between humans and devices, which do not properly understand human emotional states and needs. Addressing those problems is a motivation for the present research work. The mission hereby assumed is to include sensorial and physiological data in a Framework that will be able to manage collected data towards human cognitive functions, supported by a new data model.
By learning from selected human functional and behavioural models and by reasoning over collected data, the Framework aims to evaluate a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life, with physiological indicators of emotional states, to be used by new-generation applications.
Resumo:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter are among the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed due to the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters as safely as possible. This master thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology was proposed for flood-resilient housing models, identifying key criteria that housing should meet.
The methodology is based on the US Federal Emergency Management Agency requirements and recommendations in accordance with specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to sea level rise resulting from climate change – was analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology with the intent of assessing the flood resilience of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
Resumo:
This research is titled "The Future of Airline Business Models: Which Will Win?" and is part of the requirements for the award of a Masters in Management from NOVA SBE and another from Luiss Guido Carli University. Its purpose is to elaborate a complete market analysis of the European Air Transportation Industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was conducted. Then, a detailed overview of the main European airlines and the strategies they have implemented so far was developed. Finally, the research is illustrated with three case studies.
Resumo:
The measurement of Pulse Wave Velocity (PWV), as a complementary means of diagnosing and treating cardiovascular disease, is considered an early marker of arterial impairment in several clinical contexts. Its role is consolidated in the primary prevention of arterial disease through the assessment of the mechanical properties of blood vessels, namely their stiffness and distensibility. This work aims to compare experimental PWV values obtained with two different measurement methods. To acquire the pulse waves, two non-invasive devices were used that allow an in-vivo assessment of the mechanical properties of blood vessels. They differ in the physical principle on which they are based: the Complior uses pressure sensors based on the piezoelectric principle, while the prototype of the VasoCheck device, developed in our NMT group, uses infrared optical sensors based on the photoplethysmographic (PPG) principle. Measurements with both methods were carried out in a controlled environment on a sample of 42 healthy volunteers aged 20 to 30, of both sexes, with no associated cardiovascular diagnosis. The experimental data obtained with the methods and sensors described above were treated statistically. The PWV values obtained with the PPG method proved consistent and in good agreement with those obtained with the piezoelectric method. The PPG method appears to be faster to use. In the future, it will be important to repeat the series of measurements with both methods on a larger sample, extending it also to volunteers with a known cardiovascular diagnosis.
Based on this work, the poster "Análise comparativa de dois métodos de medição da Velocidade da Onda de Pulso" (Comparative analysis of two methods for measuring Pulse Wave Velocity) was presented at the International Conference on Health Technology Assessment and Quality Management 2012, held on 3-4 February 2012 at the Escola Superior de Tecnologia da Saúde de Lisboa.
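Whatever the sensor principle, pulse wave velocity follows the standard definition PWV = D/Δt, i.e. path length between the two measurement sites divided by the pulse transit time. A minimal sketch with hypothetical carotid-femoral values:

```python
def transit_time(t_proximal, t_distal):
    """Pulse transit time from the arrival times of the pulse foot
    at the proximal and distal measurement sites (seconds)."""
    return t_distal - t_proximal

def pulse_wave_velocity(distance_m, transit_time_s):
    """PWV (m/s) = path length between sites / pulse transit time."""
    return distance_m / transit_time_s

# Hypothetical measurement: 0.5 m path, pulse foot detected 60 ms apart
# on the two synchronized waveforms.
pwv = pulse_wave_velocity(0.5, transit_time(0.000, 0.060))
```

In practice the delay between the two waveforms is usually extracted automatically, for example by foot-of-the-wave detection or cross-correlation, which is where the two sensor principles compared in this work differ.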
Resumo:
Master's dissertation presented to ISPA - Instituto Universitário
Resumo:
The growing significance of companies that conduct international business at or near their founding has critically challenged previous incremental models of international expansion. This thesis cross-compares, by means of a multiple-case study, eight Portuguese start-ups among themselves and against the theoretical concept of born-global firms versus traditional ones. This work project finds that: (1) active entrepreneurs with a global vision from inception are essential for the implementation of international strategies; (2) only the formal network plays a key role in successful internationalization; and (3) inimitable sources of value creation, niche-focused strategies and unique intangible assets are also crucial.
Resumo:
INTRODUCTION: The treatment of individuals with active tuberculosis (TB) and the identification and treatment of latent tuberculosis infection (LTBI) in contacts are the two most important strategies for the control of TB. The objective of this study was to compare the performance of tuberculin skin testing (TST) with QuantiFERON-TB Gold In-Tube® in the diagnosis of LTBI in contacts of patients with active TB. METHODS: Cross-sectional analytical study of 60 contacts of patients with active pulmonary TB. A blood sample was taken from each contact for an interferon-gamma release assay (IGRA), and the TST was subsequently performed. A receiver operating characteristic curve was generated to assess cutoff points, and sensitivity, predictive values and accuracy were calculated. The agreement between IGRA and TST results was evaluated by the Kappa coefficient. RESULTS: Sensitivity of 67.9%, specificity of 84.4%, PPV of 79.1%, NPV of 75% and accuracy of 76.7% were observed for the 5 mm cutoff point. The prevalence of LTBI determined by TST and IGRA was 40% and 46.7%, respectively. CONCLUSIONS: Both QuantiFERON-TB Gold In-Tube® and TST showed good performance in LTBI diagnosis. Diagnostic methods with higher sensitivity and specificity are still needed for LTBI, preferably at low cost and without requiring a return visit for reading, since early treatment of latent forms can prevent active TB.
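The reported performance figures follow from a standard 2×2 confusion table (TST against IGRA), and the agreement from Cohen's kappa. The sketch below uses counts reconstructed to reproduce the reported percentages for 60 contacts; they are an illustration, not the study's published table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return dict(
        sensitivity=tp / (tp + fn),
        specificity=tn / (tn + fp),
        ppv=tp / (tp + fp),
        npv=tn / (tn + fn),
    )

def cohen_kappa(a, b, c, d):
    """Cohen's kappa for two binary raters: a = both positive,
    b = rater 1 positive only, c = rater 2 positive only, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                        # observed agreement
    p_pos = ((a + b) / n) * ((a + c) / n)   # chance agreement on positive
    p_neg = ((c + d) / n) * ((b + d) / n)   # chance agreement on negative
    pe = p_pos + p_neg
    return (po - pe) / (1 - pe)

# Reconstructed counts: TST+/IGRA+ = 19, TST+/IGRA- = 5,
# TST-/IGRA+ = 9, TST-/IGRA- = 27 (total 60).
m = diagnostic_metrics(tp=19, fp=5, fn=9, tn=27)
k = cohen_kappa(19, 5, 9, 27)
```

With these counts the metrics match the abstract (sensitivity 19/28 ≈ 67.9%, specificity 27/32 ≈ 84.4%) and the prevalences come out as 24/60 = 40% for TST and 28/60 ≈ 46.7% for IGRA.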