901 results for Formal Methods. Component-Based Development. Competition. Model Checking
Abstract:
Integrated master's dissertation in Mechanical Engineering
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
In this work we report four different destructive and non-destructive methods for detecting picorna-like virus particles in triatomines. The methods are based on direct observation under a transmission electron microscope, and they consist of four ways of preparing samples of presumably infected material. The samples are prepared by processing parts of dead or live insects, or dry or fresh insect feces. The methods can be used as analytical or preparative techniques, both for quantifying virus infection and for checking virus integrity. In this work the four methods are applied to detect Triatoma virus (TrV) particles in T. infestans colonies.
Abstract:
Glucose supply from blood to brain occurs through facilitative transporter proteins. A near-linear relation between brain and plasma glucose has been experimentally determined and described by a reversible model of enzyme kinetics. A conformational four-state exchange model accounting for trans-acceleration and asymmetry of the carrier was included in a recently developed multi-compartmental model of glucose transport. Based on this model, we demonstrate that brain glucose (G(brain)) as a function of plasma glucose (G(plasma)) can be described by a single analytical equation comprising three kinetic compartments: blood, endothelial cells, and brain. Transport was described by four parameters: the apparent half-saturation constant K(t), the apparent maximum rate constant T(max), the glucose consumption rate CMR(glc), and the iso-inhibition constant K(ii), which suggests G(brain) acts as an inhibitor of the isomerisation of the unloaded carrier. Previously published data, where G(brain) was quantified as a function of plasma glucose by either biochemical methods or NMR spectroscopy, were used to determine the aforementioned kinetic parameters. Glucose transport was characterized by K(t) ranging from 1.5 to 3.5 mM, T(max)/CMR(glc) from 4.6 to 5.6, and K(ii) from 51 to 149 mM. It is noteworthy that K(t) was on the order of a few mM, as previously determined from the reversible model. The conformational four-state exchange model of glucose transport into the brain includes both efflux and transport inhibition by G(brain), predicting that G(brain) eventually approaches a maximum concentration. However, since K(ii) largely exceeds G(plasma), iso-inhibition is unlikely to be of substantial importance for plasma glucose below 25 mM. As a consequence, the reversible model can account for most experimental observations under euglycaemia and in moderate cases of hypo- and hyperglycaemia.
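The abstract does not reproduce its analytical equation, but the reversible Michaelis-Menten relation it builds on is commonly written as G(brain) = ((T(max)/CMR(glc) - 1) * G(plasma) - K(t)) / (T(max)/CMR(glc) + 1). The minimal Python sketch below assumes that standard form (it is not stated in the abstract) and uses parameter values inside the reported ranges; the full four-state equation with K(ii) is not implemented here.

```python
# Sketch: brain glucose vs. plasma glucose under the standard reversible
# Michaelis-Menten model (assumed form; the four-state analytical equation
# is not given in the abstract and is therefore not implemented).

def g_brain_reversible(g_plasma, kt=2.5, tmax_over_cmr=5.1):
    """Brain glucose (mM) as a function of plasma glucose (mM).

    kt            -- apparent half-saturation constant K(t), in mM
    tmax_over_cmr -- ratio T(max)/CMR(glc), dimensionless
    Defaults sit inside the ranges reported in the abstract
    (K(t): 1.5-3.5 mM, T(max)/CMR(glc): 4.6-5.6).
    """
    r = tmax_over_cmr
    return ((r - 1.0) * g_plasma - kt) / (r + 1.0)

if __name__ == "__main__":
    for gp in (5.0, 10.0, 25.0):  # euglycaemia up to marked hyperglycaemia
        print(f"G_plasma = {gp:5.1f} mM -> G_brain = {g_brain_reversible(gp):5.2f} mM")
```

Within the reported K(ii) range (51 to 149 mM), an iso-inhibition correction would bend this near-linear curve only for plasma glucose well above 25 mM, which is the abstract's point about the reversible model sufficing under euglycaemia and moderate deviations from it.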
Abstract:
BACKGROUND: The goals of our study are to determine the most appropriate model for alcohol consumption as an exposure for burden of disease, to analyze the effect of the chosen alcohol consumption distribution on the estimation of alcohol Population-Attributable Fractions (PAFs), and to characterize the chosen alcohol consumption distribution by exploring whether there is a global relationship within the distribution. METHODS: To identify the best model, the Log-Normal, Gamma, and Weibull prevalence distributions were examined using data from 41 surveys from Gender, Alcohol and Culture: An International Study (GENACIS) and from the European Comparative Alcohol Study. To assess the effect of these distributions on the estimated alcohol PAFs, we calculated the alcohol PAF for diabetes, breast cancer, and pancreatitis using the three above-named distributions and using the more traditional approach based on categories. The relationship between the mean and the standard deviation of the Gamma distribution was estimated using data from 851 datasets for 66 countries from GENACIS and from the STEPwise approach to Surveillance from the World Health Organization. RESULTS: The Log-Normal distribution provided a poor fit for the survey data; the Gamma and Weibull distributions provided better fits. Additionally, our analyses showed that there were no marked differences in the alcohol PAF estimates based on the Gamma or Weibull distributions compared to PAFs based on categorical alcohol consumption estimates. The standard deviation of the alcohol distribution was highly dependent on the mean, with a one-unit increase in mean consumption associated with an increase in the standard deviation of 1.258 (95% CI: 1.223 to 1.293) (R2 = 0.9207) for women and 1.171 (95% CI: 1.144 to 1.197) (R2 = 0.9474) for men. CONCLUSIONS: Although the Gamma distribution and the Weibull distribution provided similar results, the Gamma distribution is recommended for modeling alcohol consumption from population surveys due to its fit, flexibility, and the ease with which it can be modified. The results showed that a large degree of the variance of the standard deviation of the alcohol consumption Gamma distribution was explained by the mean alcohol consumption, allowing alcohol consumption to be modeled through a Gamma distribution using only average consumption.
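To make the recommended approach concrete, here is a minimal sketch that builds a Gamma distribution from mean consumption alone, using the women's slope of 1.258 reported above to predict the standard deviation, and then evaluates a continuous PAF. The relative-risk function and the mean of 12 g/day are hypothetical placeholders, not values from the study.

```python
# Sketch: model alcohol consumption with a Gamma distribution whose SD is
# predicted from the mean (slope 1.258, the women's estimate quoted above),
# then compute a population-attributable fraction. The relative-risk
# function below is a hypothetical placeholder.
from scipy.stats import gamma
from scipy.integrate import quad

def gamma_from_mean(mean_g_per_day, sd_slope=1.258):
    """Build a Gamma distribution from mean consumption only."""
    sd = sd_slope * mean_g_per_day        # SD predicted from the mean
    shape = (mean_g_per_day / sd) ** 2    # k = (mu / sigma)^2
    scale = sd ** 2 / mean_g_per_day      # theta = sigma^2 / mu
    return gamma(shape, scale=scale)

def paf(dist, rr, upper=150.0):
    """Continuous PAF: integral p(x)(RR(x)-1)dx / (1 + that integral)."""
    excess, _ = quad(lambda x: dist.pdf(x) * (rr(x) - 1.0), 0.0, upper)
    return excess / (1.0 + excess)

if __name__ == "__main__":
    dist = gamma_from_mean(12.0)          # 12 g/day mean, illustrative only
    rr = lambda x: 1.0 + 0.02 * x         # hypothetical linear risk function
    print(f"PAF = {paf(dist, rr):.3f}")
```

This is exactly the practical payoff the conclusions describe: because the SD is well predicted by the mean, a survey's average consumption alone suffices to parameterize the exposure distribution.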
Abstract:
We use a dynamic monopolistic competition model to show that an economy that inherits a small range of specialized inputs can be trapped in a lower stage of development. The limited availability of specialized inputs forces the final goods producers to use a labor-intensive technology, which in turn implies a small inducement to introduce new intermediate inputs. The start-up costs, which make the intermediate inputs producers subject to dynamic increasing returns, and the pecuniary externalities that result from the factor substitution in the final goods sector play essential roles in the model.
Abstract:
Many workers believe that personal contacts are crucial for obtaining jobs in high-wage sectors. On the other hand, firms in high-wage sectors report using employee referrals because they help provide screening and monitoring of new employees. This paper develops a matching model that can explain the link between inter-industry wage differentials and the use of employee referrals. Referrals lower monitoring costs because high-effort referees can exert peer pressure on co-workers, allowing firms to pay lower efficiency wages. On the other hand, informal search provides fewer job and applicant contacts than formal methods (e.g., newspaper ads). In equilibrium, the matching process generates segmentation in the labor market because of heterogeneity in the size of referral networks. Referrals match good, high-paying jobs to well-connected workers, while formal methods match less attractive jobs to less-connected workers. Industry-level data show a positive correlation between industry wage premia and the use of employee referrals. Moreover, evidence using the NLSY shows similar positive and significant OLS and fixed-effects estimates of the returns to employee referrals, but insignificant effects once sector of employment is controlled for. This evidence suggests referred workers earn higher wages not because of higher unobserved ability or better matches but rather because they are hired in high-wage sectors.
Abstract:
This paper investigates the role of employee referrals in the labor market. Using an original data set, I find that industries that pay wage premia and have characteristics associated with high-wage sectors rely mainly on employee referrals to fill jobs. Moreover, unemployment rates are higher in industries which use employee referrals more extensively. This paper develops an equilibrium matching model which can explain these empirical regularities. In this model, the matching process sorts heterogeneous firms and workers into two distinct groups: referrals match "good" jobs to "good" workers, while formal methods (e.g., newspaper ads and employment agencies) match less-attractive jobs to disadvantaged workers. Thus, well-connected workers who learn quickly about job opportunities use referrals to jump job queues, while those who are less well placed in the labor market search for jobs through formal methods. The split of firms and workers between referrals and formal search is, however, not necessarily efficient. Congestion externalities in referral search imply that unemployment would be closer to the optimal rate if firms and workers 'at the margin' searched formally.
Abstract:
Computed Tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning for retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but it has low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning; it is used only for diagnosis and follow-up. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in 3D US, the segmentation of the organs at risk (OAR) in CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, we present two approaches for the fusion of 3D CT and US images: (i) a landmark-based transformation, and (ii) an object-based transformation that makes use of eyeball contour information in the CT and US images.
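A common way to realise the landmark-based CT-US transformation mentioned above is a least-squares rigid fit over paired anatomical landmarks (the Kabsch/Umeyama method); the sketch below illustrates that generic technique and is not the authors' implementation.

```python
# Sketch: least-squares rigid registration (rotation R, translation t)
# from paired 3-D landmarks, one common realisation of a "landmark-based
# transformation". Illustrative only, not the authors' implementation.
import numpy as np

def rigid_landmark_fit(src, dst):
    """Find R, t minimising sum ||R @ src_i + t - dst_i||^2.

    src, dst -- (N, 3) arrays of corresponding landmarks
                (e.g. points picked on the US and CT volumes).
    """
    src_c = src - src.mean(axis=0)       # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(6, 3))        # synthetic landmark set
    a = 0.3                              # rotate by 0.3 rad about z
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    moved = pts @ R_true.T + np.array([1.0, -2.0, 0.5])
    R, t = rigid_landmark_fit(pts, moved)
    print(np.allclose(pts @ R.T + t, moved))   # True: transform recovered
```

With exact correspondences the fit is closed-form and exact; with noisy clinical landmarks it returns the least-squares optimum, which is why landmark-based registration is often used as a robust initialisation before finer object-based alignment.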
Abstract:
Functional neuroimaging has undergone spectacular developments in recent years. Paradoxically, its neurobiological bases have remained elusive, resulting in an intense debate around the cellular mechanisms taking place upon activation that could contribute to the signals measured. Taking advantage of a modeling approach, we propose here a coherent neurobiological framework that not only explains several in vitro and in vivo observations but also provides a physiological basis to interpret imaging signals. First, based on a model of compartmentalized energy metabolism, we show that complex kinetics of NADH changes observed in vitro can be accounted for by distinct metabolic responses in two cell populations reminiscent of neurons and astrocytes. Second, extended application of the model to an in vivo situation allowed us to reproduce the evolution of intraparenchymal oxygen levels upon activation as measured experimentally without substantially altering the initial parameter values. Finally, applying the same model to functional neuroimaging in humans, we were able to determine that the early negative component of the blood oxygenation level-dependent response recorded with functional MRI, known as the initial dip, critically depends on the oxidative response of neurons, whereas the late aspects of the signal correspond to a combination of responses from cell types with two distinct metabolic profiles that could be neurons and astrocytes. In summary, our results, obtained with such a modeling approach, support the concept that both neuronal and glial metabolic responses form essential components of neuroimaging signals.
Abstract:
The purpose of this work was to design a condition monitoring system for two glass wool production lines. In addition to the design process, the work presents various condition monitoring methods. The work begins by describing condition monitoring methods that can be used to track the operating condition of various devices and machines. Particular attention is given to vibration measurements, which are becoming increasingly common in industrial condition monitoring. The condition monitoring system designed in this work is based on five methods: vibration measurement, temperature measurement with a thermal camera, temperature measurement with a portable meter, listening with an electronic stethoscope, and monitoring the condition of rotating parts with a stroboscope. The design of the condition monitoring system was carried out in several stages. First, the devices most important for production and their possible failure modes were identified. Next, suitable condition monitoring methods were selected and a measurement plan was drawn up, specifying the measurements to be performed on each device and the intervals between measurements. Finally, the work presents a few example cases of the use of the condition monitoring methods and discusses possible directions for future development.
Abstract:
The subject of this master's thesis is the development of business networks and the concepts and methods related to it. The aim of the thesis is to clarify the concepts related to networking, the development of a network relationship, and the development of a business network. In the thesis, networking is examined through transaction cost theory, the resource-based view, and the interaction approach of the IMP research group. Networking is seen as part of the organizational renewal process and is modeled by means of the development process of a network relationship. Network development is an extension of the organizational renewal process and part of the development process of a network relationship. With regard to network development, the thesis examines how developing a network differs from internal company development, and what challenges and success factors can be identified in joint development activities. The thesis also classifies development targets and examines what kind of organizational model for development work, and what tools and methods, can support the development of a business network.
Abstract:
The patent system was created for the purpose of promoting innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how that invention is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits. Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with a focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal, and business developments concerning software and business-method patents are investigated and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change that the patent system is facing, and of how these challenges are reflected in standard setting.
Abstract:
Static process simulation has traditionally been used to model complex processes for various purposes. However, using static process simulators to prepare holistic examinations aimed at improving profit-making capability requires a lot of work, because producing results requires assessing the applicability of detailed data which may be irrelevant to the objective. The data relevant to the total assessment gets buried by irrelevant data. Furthermore, the models do not include an examination of maintenance or risk management, and economic examination is often an extra property added to them, one which can be performed with a spreadsheet program. In this work, a process model applicable to holistic economic examinations has been developed. The model is based on the life cycle profit philosophy developed by Hagberg and Henriksson in 1996. The construction of the model has utilized life cycle assessment and life cycle costing methodologies, with a view to developing, above all, a model applicable to the economic examination of complete wholes and requiring only information focused on aspects essential to the objectives. Life cycle assessment and costing differ from each other in their modeling principles, but features of both methodologies can be used in developing economic process modeling. Methods applicable to the modeling of complex processes can be examined from the viewpoint of life cycle methodologies, because they involve the collection and management of large bodies of information and the production of information for the needs of decision-makers. The results of the study show that, on the basis of the principles of life cycle modeling, a process model can be created which may be used to produce holistic efficiency examinations of the profit-making capability of a production line with fewer resources than with traditional methods. The calculations of the model are based as far as possible on the information system of the factory, which means that the accuracy of the results can be improved by developing information systems so that they provide the best possible information for this kind of examination.
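As a toy illustration of what a life cycle profit figure is, the sketch below discounts yearly revenues, operating costs, and maintenance costs against an up-front investment; all parameter names and figures are illustrative and far simpler than the factory-data-driven model described above.

```python
# Sketch: a bare-bones life-cycle profit calculation (discounted revenues
# minus operating, maintenance, and investment costs over the plant life).
# All names and figures are illustrative placeholders; the model in the
# abstract aggregates far more detail from factory information systems.

def life_cycle_profit(years, revenue, opex, maintenance, investment,
                      discount_rate=0.08):
    """Net present value of profit over the production line's life."""
    npv = -investment                       # up-front investment, year 0
    for year in range(1, years + 1):
        cash_flow = revenue - opex - maintenance
        npv += cash_flow / (1.0 + discount_rate) ** year
    return npv

if __name__ == "__main__":
    lcp = life_cycle_profit(years=15, revenue=5.0e6, opex=3.2e6,
                            maintenance=0.4e6, investment=8.0e6)
    print(f"Life cycle profit = {lcp:,.0f} EUR")
```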
Abstract:
Lentivirus-based gene delivery vectors carrying multiple gene cassettes are powerful tools in gene transfer studies and gene therapy, allowing coexpression of multiple therapeutic factors and, if desired, fluorescent reporters. Current strategies to express transgenes and microRNA (miRNA) clusters from a single vector have certain limitations that affect transgene expression levels and/or vector titers. In this study, we describe a novel vector design that facilitates combined expression of therapeutic RNA- and protein-based antiangiogenic factors as well as a fluorescent reporter from back-to-back RNA polymerase II-driven expression cassettes. This configuration allows effective production of intron-embedded miRNAs that are released upon transduction of target cells. Exploiting such multigenic lentiviral vectors, we demonstrate robust miRNA-directed downregulation of vascular endothelial growth factor (VEGF) expression, leading to reduced angiogenesis, and parallel impairment of angiogenic pathways by codelivering the gene encoding pigment epithelium-derived factor (PEDF). Notably, subretinal injections of lentiviral vectors reveal efficient retinal pigment epithelium-specific gene expression driven by the VMD2 promoter, verifying that multigenic lentiviral vectors can be produced with titers sufficient for in vivo applications. Altogether, our results suggest the potential applicability of combined miRNA- and protein-encoding lentiviral vectors in antiangiogenic gene therapy, including new combination therapies for amelioration of age-related macular degeneration.